[Binary artifact, not renderable as text: a POSIX tar (ustar) archive of Zuul CI job output. Recoverable member metadata: `var/home/core/zuul-output/` (directory, mode 0755, owner core), `var/home/core/zuul-output/logs/` (directory, mode 0755, owner core), and `var/home/core/zuul-output/logs/kubelet.log.gz` (gzip-compressed kubelet log, mode 0644, owner core). The compressed log contents are binary and cannot be recovered from this dump.]
2JDž(6!H4~:sn7j|o=>Yyvs~U~`Иڕ*@` QX٠I#D#4^>6bE1X6&sQ#*8=sNU4tt .rUEV5^ӚɈ%SoXw_!&-3y `eaiL)&O:RNrdŪA'~JOT55+ɶTuVtfqj| M :w [c3W_njMhkry\2ϗE}@OtY[c=Rt-:ӐmRb8mE`R~JAy7ǜq\rSK>V3?{q[ Lf|?/Xڢ}l.rN%Tj֯SHҶ.O~oh+%z]3Y9շT?c?=uIdo8׊ajr%ɃFF^BL%!rS7엟ϧ$QWϡ3 g d]7O YKK RB|!x>)IYxqǼru:7iRu!s5G.WE^yBצYXUò3S?;kg7@>DBZ@m7~g odzZ.4FrrwmBF",pgL>9=apgVI`4zQ9V;.b h}~[&'dVE8[[|N`8}զ'&F6LZk{0ؤZMVӾf紉y] 7цѺ3okgnA}3k<#TJUDVG^PhY^0pFzBJi,Q"p5X 뭼#3A[haF)%0rj`\7:# S47i-N9Bau %tC*{olU[t9uڢ;d%jW.J\%j>tJT.\fL@0%W\v4p/]%*E']DJ~_\#C"cD-?xS{R^ \)GW@Ff0`MJ;zpGWZjDGJ䲣:\%*UwZs RDڌ`o7"Wbpj3*Łp:vX#+ -rajk  :\%*;7 %Kw:O_Gb6 pИ(m5cJCQA2e9#FNbCڀaޣ-ffZLAsҞ{r2j9J;?KE.9>&Ȍ 7?b<'-|1f JcZ%BӮ^?EI-#wTj, +à z#(wR{u/=r+Z+J>5^,mJƦviLb?{c6(BcY3W`<,l5U䟼&n,(O;)U_[ؖq_j-7f`7}83f 35 Ql Bι`1X> qPr a[P2u`{}&~g;j<y, 1<Ncʰѧ1&*ʛo"k/9vª3|%gSѨ 3|E&P-.I?[Y R3EyKdw1ʲ׋neXur U:131lm'+A T yD֞D>k# !|,֞D->xt=D<Ҳ2q])Q*.Ji_O҄]/sI0^KuTN `J)atOqJ3mrDGE-R)JBbN02% aں?k#͖ h⡯aUf,.b:]zLߏ(#iҥ |ͫ6^2^Iq0uY;9is7 J[t3ęSWݩUw;u՝N]uSWݩUwz j 7 %N]uSWݩUw;u՝N]uSWݩUw:`Ua]pYOTc#8'bqLrڬ>GΘ8d-nKPf.f#-C敳ׇC].<(V(^S/I)I|4V`f6[ l+s^)@3&@\#ǔB$|+(2hݮu+z%6!`BL1jveZoAkY˲yk䬩``Dn_/X|%= 3_yW ˹HJCMO=e-&ч  DJ}#8R ~rKd;Ȟ:_ubԩԔ5" 2G#p2JaQiƝVc ;"ꮂ<A~, *T] 6k؈9 &1L`qP \ G47$oX!Y?9vܾikzn#JY=nz0|) F_:fy3&2c`goſg6xR$&e'}el/@Kd l4 1`IU`u>ݔxh\~~^7L֧2${$tVTUR*S+˙ a]=滷wo.]yΗ_ uI ɝIi'[p9qlko5Ul)ks/#moYƗ@N h a6ׇu zGxPG꾞TήV=Ei~MM42j2 |J&Ί Xp^EB' Kj HaMƈSm"a^n?B"kڃhb-wN48eB*!4=3&)G"^Bj&ٷ0O,FКf; .WB__˫o3䃵BJkB&ΎX&vI+:oQ#cmV^n]ܥZnR%!Д$*R3`OLj9ԧwXpAQXϒ\%g̜a!5}6dA:|:KGG8|(E,[~&>xQߴR/&Ѿ q34 .LJjj>$34K[ˌ3^+Vy$85NN{QX-QSZʍD ]ѫY`_mvXȨƹRǞk+ |D`X], b1>XhyT`:`AܓWuo ɅPpTǀMٻFrW1A7|gq,-Kl"Yuٹ[-[X/cSic%nU=* YPۖ{"$r(e1'1HQ"rHv4YAc\,a3q~:Q:ܿ7//Mvn>'/I JK[e&Ԇ.Ⱦ쌣JT6LVOX*ცWұ($L"Wj#]9z/ 56f7T:tEX*'SI7ӓ'{%O?pyu߲Vsg\q#8ڞjyXS߾V0fGlF^\Hг%hQGVnlKf@.Z*J\BT)C)=e'Y() *Ͱf< ]c,# o%׶ғՋ}p"-g75lXs. 0U2E[ R=t$؁S}B*Cl?A÷{1**lj,;M E@&1b7g?bNjy1OiǩQ{`w]w:f6X]q¦ ® ջmc<谴.yL4!g( GIg$S0&ǎ#3Bc.,w⠚Z0w Ys-._#A;m71* (_kPsجӿU\cJ*üJ aa09͋|_?*c¬,/%.N:1@K Qpt,G+.]cGq@YEf@kLrVt12t Im͟ȁŤR,Ash+McIF _nL>#twInH_ҁ糿}c#YWu>ƈ"b:n=2r<<>Y2D @gu]䴗A`t~x8y4o6 x`J<`_wp_^5hlA!N..=>n]ϩث[-8QB䠜ӹMRA Dr !Ǒ䍫kå{{1yƁvp45=퇇OtbJRgٲ4hGKƔ. pDc"vK^K5F4aȳ>)2y򳯬}8) PGئ"! 
"d˹) y('k ij2 3HHEGFCȵl>5-@+qwXրNf;sʾ8gwx0|I*܎n/ۭm:F[;z%(s4(OlηUAZ1X3$-.g-e]Q:EKݧ oڷA }/<76Fj݋Hjɑn,ܩKRKC=sLl0ݪ.!X~dJZt;#u`A[\ds,<ђM|V-/sшD+)@JJ2VF2-_Yo1+˫ ķ_d:>Kj̰t" /;_Mw9,m šE;(Y':[ ۹Sx3*4eI%@ .0V i@& %rW s(W&$њ8-Q?ŲlzڛzW1b,_'tRz#ܝ*nZ*:dk1r#1r<ȱOI0+ ΣcRV!ω"P٬6,۰9-y,N?guev*|8)"}tw=u,bo*^ã;_ݭFo|g1#4shbI9%8(Sʨ>%>fӸ%9 X>^.v'.?MM6G o>+I2:&K/mRvUM.c0s.d+Qj2`iD B.#0rHiR*Q =$t"l?=%heH)Fr kv-0HovOEd 2dﵲlZ9.J5VԚ`898{>ly_͂=t~YzKTwWqQ-3֭x6Gt%x9Ds}t"JxPqFsq<#ly`T2xlH>jQ eIgͦŮH#h5C[ y Ou1ͪޟ=8dO6+#ǛzbŶ5B8!-TQ)X4 QVo1=h8TghPqÅl/jào2$8dX)GvjWgޝ`x7*dQ0m#+P67r/jA Y]:] +eA׹ՆlBߒIT.>fOq0Z.>\қ]} {E젩Ϡ{8ELR& ɓZ^:k۪%ZtwOPIGt^u/wѻqcY!$,sfG|k\TYۖtQNckRf g)[wڦ_6U1Ԍp+'{Av/<0~jUs1^{\*d#K E&>-m>zy$pOZS1LY :Ԛ8Њ'yߢ3[^~eZX-*4uF1[-/?#v{{^$x;!93š!R< :L.JX<֝VjJ"%E1_4rOn3 C5/*b4q؜ EHRQT1k(j>(!YI^b( .%U,<8Y3qޮ~JfffEe/`KDJ5:`ƅ a*ctBmbYs֣ 4Pӯ {!鐅grw*fʫPOl]fVZ8in"EK@hCv/^pF")g'yY5~&qR[{tI[{Y|r{,ӱh^!ʆ|$L_OduH`EC%FNҎ.#²]_QWkI^Rkl|kƓ#BN%V"_Y2@2#LgVw6P!uCg~AFֱ3\0"bɕ׏4謹*B⾷E_ױ[x$#fI!@B,:7Bvɤ.Ϥ0B]~/} snk~93z{479'mly~<ݻi7X4ܴboŐ&Ȓ'iM;MJűXiCRp͙<'2_DI(k᷷+,_O;.ԾkUfn4?`]y{ 7OsLdL*TvG -⪻+!:N<{ B1&{ˋܻ(z`v*_R1Lbsd~<ݾ`[PۻѼ{e xA+p(r^ޞ'Ty>/ux4_̮ˋh׽pW X  P WB :A\YA%y8`pr% W֐lճmXvDW ud\+qqN>ک4_-pe\[ThI@B2\\%BպBF :A\1I  ʥ&\Zn+T)ـL(ؐ`prWVGWW'+!, \`l0B*\Z#+TٷWO+  [ @.%6\ZvT)\"\ˀp `P}cUJ;qWF+9W(W3*w\J=]"R+  P&BB;m+\>;$L\7@!A`Tٟ%&F(фETRM[OhzS2+=cn{}*";0E]f[Ԗvp`Ars43~s>q@"Qʐf`x ʵ<JP{n KM%OgdDPV=R%Vrs Z0Q %0?qDiJVJu;siQ?>zMbgz|~kΚďߔPXA%Mx_HxblS7Jk/![eT 2z}we ߕBjqj (;nB]n}t[w\ӤM͉k%)HEZ'EϮ E%!E@ERйʹi@Y,8bSc2dm&IQDpm[DLuٙ ]|[mjw4y_x>bݯbSCުv|(G[HQN{(¸ |5ZOU:Ǜ#N[n%58-Tˮ8}#PGoY$; .Dp)l &wc7\nXIƙ`D1Oy'ټGi^Wn6^T\lSOnD7uL*wp.c7mSMwnܾ5nRr-#Uo)wiM;gzK=[Ha(hZH%V+˂BR :I\aR"Abn%U-uS0r*DZͦz'vdvNfI2Ɣ:z)u+ u 7|> ,;_)y-?s$cl۸3 *XQ[83I 9++;m4}RC)QMܰ*F#Q͕Q*;g~l ԏuk'Xr jÛf<44O͵?`D+Ttq*puWn `0p=<1}Uj8E\|!R 'Bچ+i!o ʞp4R(k֜++x(BWr:I\iK{Qt;J]\LW;սo J;DW+c!EW(ؒ`pr 1j;PSĕZpe!: 2JCAP)ɀ+ڰFU+#㪝\yd\SqpJ%':-zF : \`im0{S5;@{=tpʼnìVV jCTWZbW+A* \ ʈ r.C :A\I W(X`prWVkh;:A\)j#b`@-o J^[6>tp+m8# `+8y-0V :A\8 pFA &\Z NWYJm@B6 e,yWVTr2t?\EF7r[ VǞN9rU+XȴSٷ ؀}rI}N;F+P+(;Pe6p4ʨn*V`N` 7nja>%ߪ8G mh^:`OBhcӻXo0fR`(F\*'B<Ŷ s= w\Ji\ 4c+ǞN}t*0q(@f&\\)B54 :A\8 ?] `XvSSĕZ@0 W(WD>Wqe$P 3JJ+TĕUh,F `pjU6J3,mz> W}[Vrձ]S+E;g-z,!2 \`nȕƆ+Pk;&W'+4 \`.T0BZ+P+;Pep$Z,i%XR P.ӡ kv*IJBeH}W q P >q*vz〫'4jp!\\LcRvT쀫ĕ2J(@dtcjoV U!:E\ic 3)<\Ztq*02xo"6AL W(WPpj;@%#CtuV<7 p+) W1*g+ѰňXG[ GکGl~JpoS$l0/n]K#|բ$g˘m\HW 6&k4; `=?MZZJG_@ʀ X L*J՚7A&lhb`jDmG~tS-h Qz<2d'Y|ReѶ,fG*OדɋfW e|BߋXzEu,+L[,—[\yTC߫)&?|ֵ"~A-(~6l=گ/6_}[$.P{CLdz% ITnp KsΕr"Բ$KV^԰d.OmNSu*K@Ike4_bvK,WbDbp oIuAt4StDϹ7%2RHCMM1YK},5u5@V/v3\ZꬸCV5go*~N!ڸVkj6Y_u oC qʗUndh-jpʵ§)zx!H<େr8) aX%&<8"LBs2ʙ ʥ**T\;L=I'69vۖ-ԁZh=)z eJSp܁%YCYIYrǼK2/Y~Hf)K(ǰ>a,x-*MY. &7kT!xc?LNC?+> CPSRPqceesBYKn-`&@`hyHQ&Yo@݀GR ԛ> ÿ 5Z@*N@lm{:03_.?dIK#GEj_SUb^Wa"GZ2z)̖Ťܬ6x=c;6/en>d0!.,[3NY˛&=c"Osr~rD}!ܻɍqW ڵ&Y̊`bx֓? Y.e@9ȘS$ ( QGy`_:a}Y`b<yYSG!X`G@P2""&Z \2T d)]En762y*sq]ʡPQMLM1T>D+$:I)CRa=N\DN" 48x`^LħoxF5])Y]{P^H题o$K`G\Du\rp‡|JZ^H$ޭTu*tc ^>VUVXC睄z[9+o/juVk[cR;ĬW逰 ot:/;Jx->ck] l L.RܟF8-'6db eo4[@^ e0ɕg]~7^Ogo&q?kǣɷiSі|v1epEXj\2cs:tZ3P ^z~Lg z9 ?lN2.]܊6[:֎Y݋퀴^i1%~2kDaen&0/dʯzX^r+ln%9ۂj c F0 iQ6Gacf|tkJc,mCK= dp S+CZ MWCJqTQ( Ȣ`;X+\$JGFҳ!6DݗB'c^GC!x^e^t'/a_=P*o"-3Ep>/}}*r\.8v{;.褠0"|.fy ? 
槰"~ ff<)@QNʓTZ`W]sGqk[$T06ohU)nVA-m=3{9A {u0l)!@0Hhc<6`7n3wօysw~ \Opt]Lm1^}!$=kPo Sodg9d"D8:0DajDY@D8ЖkǰFW$#Ha"`%C0!)-zgM&ye4zl5ͭH+zuHZ6Pyq7@S)Vd͓d[]#*\e2@w"䩳㗪:nU7U拪(نEɽ]+~PnsSת?+_=hs @t0UnS"#UR O*X&v_~* 4[&6(3B`IlRZӡY%<&F$PĄ)/H[E6:'6Nbx`Tm,uXU[n?j_񜷇<%PY{,8YIy{'rFP*.aKQR@F3Q$!ƈzaO"-Z2A0|Ij䴠D,ēYu<bL6-_TT7!gXQhUKxy5t*LjD.$Y#_/,y1.{ٚZI :O}3xb1&2-9"5uC,jԴ;- 3zw,E8׊ajr%ɃFF!3 O=Oϟ޴]n:C3磎n]T;7Y}6h" 1&Cdg0yUnnOxi|8 "d~e7 .Q^yZf!|ǗSgyvUaU|(Zn"}]-L;b=&Ư>5Frr&Y@ш\) w!)!&ߡ_8n԰JY(G?_4ҟ]7M, ."2vBɞy/Nָ|c"9kZJkEtk;]F~KqlcSaʫf L_h|r:^OܧZԨK#m `^oi[o >q6c9"lUG5ɤxJ*VѴӯe:чc<Ԣ\r :߿𬃢mȜT N \cT|nRKgEH  `L/: j&)SYtFZrXqqE7^^@/:F7x1JAP?F)-Ņ'/>{RО4DPX~K_hô! ^ܬ X^߯_"_} W6S1K߄Yqw e]Aq~_[ē>x`}z&V Z . }5_}Մ…*.$u)h$!šMh ^OjZjpo} ,PlhwWe֗_W@#+5w,/ӶEwyu;L.iY 4oYmX.tAeP0gPcv7[64y1\d ӫ@<ٹ)[kΫmۯ!rFBB,[JL.0"1':A? *NP9KLAY(zd60"icr@D h/("^>dZ=b%ei[ynŰ$׿糞KE24uh5$RKBCpĢhZR9!6TB"gk#Pֱ;`y!s HJhxy󉙆R\.ԻyǩI20,=BqHȕ *7Y)AO4=h(sLidTD*dК V5xzO ,Lۋ'q3iz",r1@Ii ! /0UpX~PRQWD|&/WR?m`2/2 KHv@zwzX4+l!Q$sliqd)kqjqУǤO]W]aÓO0m'<ݦ'K΃l+s>R1p45JK[0QIdWF:U`MA1jveZoAkYؼ`d J>Hް]K+|9[$qgBnbKUHײ5wU10Aby`!4HiqDgBPA@OSn x 4ApauXMwD@$`C`x̰4N+ (O 8D>E P{bS`cȟ$ LP;.p 7ᨘF@ +swS7Ps-MGef<+QhDiT - 3 jQQxD50RxV<##nV6xR%MDV._[sܘl"aV!",+XJ %v(ƅ?\eީO8KvOi{: ΰu SDR J1ug,Íܝ 7ga^ mE-6`M@ZXXv&wt Ww0!cKJ}jv13>d՝_ln*QtKf K9ݱ3wL!hÁ3ʩMs֊+XBBq/a"QS:'3#Y~|]rq\x vtF&n" {oy/CϚW}j-VoѴ➽-eٳ#v BBj4S~=uǸG%kO=D=^kL>cc% *Y8t^>lν/Ʃ#)KB"l̮JYF$ou>Q bCk:u;@a\;G5O81u+Ju YjUW%7 I*UrBu9ZD3tbD3X;g߻aQԚ:;u5p{i[a ڇ˅p8>-,Ӎ[,ctd~x?w.bZBC!QG?TeL,-n0tw^nsQݢQZZQ .յK)ZR)Y]GJ9hk$9x**kV\UngdF]x.WtQ:qvIxu(dY8J_`3UY >q෰[DTi%;;g"NTxxǿvceipݱWA >WZ %ocXp N>+|!ʠ{k5ԊJIiW"'|-ΨH>)c߫궔PU:ᢃl%G&s 1z$J )(M]nrF^ҹDnב?"DB8IU{+5֤"%*m|uV -qr.Z-v]]"P}*:ךګ\L0X$V*c2QhpB&Z޵hi9jUQIui]ɪ?94=u9C8}7ANLsƝџ\ZkK< dʷHhL5Il 32W7>9,V*WaTP1$ gM:P߃Nl͹(ڎq.WESKN\}9j`K64 ꍌYJ7,b錅z^Ric#IOfۀ/9:N'?vw>_8bߜV G;8t/'~(>KzzT',#/?~/iaVET-*'XSɌP2%lj3/sÿ)zr}3'_xcv7b SóV gg{?xlφE.GMavz9yN&9ԗ_4wNNg L^dVt||2Mˊ^MIu8tpzt&j+h<է^rdQc/A?wM&6 Q2$,y?X/=R\繁uqyVȶkIEuK \v/֋=޸TQߣzQ dk EH^r2JH :D+XK֬ѧr=I6\\~wMvD;.,n74+ONjF٠ s|phX!VW.Vz4ZwXh%ݳZI"|P68)b᜛݃ycrq_~"15[vDȐ7wbA՟cڰan@8TYJ[CrN>ya IۡNz$}珳!`llvi|zцl_qw^0ԍtKRYkf{,avC|-o5H+Inh,NyܯzOYjoPvlLHWM iҲ^&0xC+hWj́+jJpդT[ &WW )pդ]jR*WW, +ysUQIպÕ#'hIkV +/IlP&uWhUR-\}1po9h9n?pu7q}Z4MJfpWz WN=I ,UWUIpU•2ΩM+eœ_do \AZk%2[٥dEzznӴ}fpn169ifؖ3>r5dNƼP+j#9!+Gp|rptmI[Bl[SHi<)ÐvN>.oѲS+O; ;ՕǢ.-{B]bN.~j\kπ{2$U\uѧsƪ蔒GLBR,nZe[#X*h!+ #o%"X\LUUȵ:[65I`p6pN@ ",i/<{C99 \pօD2id}GA#d¿Im(VU"+Jld)s@-NTcrS RPrVCe.uo'NZRKuOaBHYdhO!Jbr[&f <*,R]bR))"(a Y*L6x'X!( fh0! +@]0]GU "[*ph{(\`]r 0I`a/#j՘N ),"@&Lv,0:*k:uTU*i^(U (nDP1A_<*J[`"emU{JC3MCGv@8S8k8b.0 ֎pcRa)fIA14;x b\X2`}-XeRN9+%ޜ]Z0J892lܨj , nBKaಁW/u^rCj۽]SlpХV]YHI(bl¿;Hn+ 5bﮁ`7/ j,XVt{ZR㙖URK]tؘn^ɇ)ޘ1iBU3!qM6y-14Ȳ3Dh8Hڧ؄V. dPaUhcwY+cКe4;A7A˵[.EfF18i|1& ؼ(ڡ;IDL&!aV9 L1˫v1B;]. FjAGVew" o xs@ppi776"JpYiV2MJru IhRVjTq40B,GSkAaKh= KSvw/ @nJ.95fy B T b<{%lFp[O AuIBPX,0A%^6^ b.^cF0XX[vzD~"A: 35SC4Ǯnm^!`&#E4Q] !?GxP* Ho2"㬪ZU"$;[]^8 ޵U9zNa"CU`!bYtSEjr6h/?L((=q*Z;? >ٛOv]]~_~ۯy[}Š΃I 1q}[X2?m3>|:mwFø{k!lXû{&qɾHM[YrF`"l3T-ެϪhWs*1Ǧ:]1؆*L ![4ⴗQ|xNuzչ>njَ&CfL#0Y5ʾ9J!UG)6 nRlڰ[>&Fp[ #W,nm&׊^pEj;He ] Vk7w\JqeE<WV Z=w\ʭI: #\Aѱ\\{jJ+WKĕV ; 7D DsTJkW U!螒Ad7Z Ԇ?\ **%W$3wEr}7"WP)!zV(?8& W*X\MSk NSg 򌫧VTA* X~pZ^pEj];HeJiWeW$t]Z? 
*ЌJktW$8]WݲY(qE*NAb\-WF(tG"28Mn}rLqE*GWKĕ5B |$Vu3wEjqE*]-WΘ`[ɵDWW2*q WlU?"F+Rf+R8\"A `'<|>H%]-Wcz ؊ H 5q" ~pY᭠E28I~28Mn< gZ0WqԪN2J:)D?"J+RkqE*d\-WTB d7 M+ㅹ2`~ 98)jPX& 6tbCs1MSl +6w+M/"q@\/$ƪnpEr}7Z+qE*b\-Wkcz Hy+=.bTȸZ ΪpEc?zMtEjAR8\"ApE;JIWǹTF^ͶD\VJˎpZZ H}tE*#GWKU FHzIjTj^U|dǷ"8+om&8j\'jZ}my3;-NUd\=e"pnpErWP;H20^W*jeeG`mT7"F3w\Aq@\ No$v+{'GWP`3+#\ ׉nARg]J˸Z"PJ]A֦\\{tf2+/\p==hj'^+RqM~28MnD0&*mx˸zJK%Oo(#\z-Rkqݜ1`;VaϸVmc ψy0_~T=^0;aR%{!g}ô4b( RVRl6Jwb:Q}M*{)Vj #H:]+RqE*b\-WFxmy& v< A+Wb[IjWW$^pAT*N+ =f`o|7"z:ptx Ɵ+Ri,jF' v 6 3w\J`0= W$'֋ *7*ZWd?sW$Ww+RkgdTz ~p%Yx;WƦjp`\MR NS9dPNd\=ꥍ)p%2v+}/"n"qH\y}eYSAu|twӎ޾5Bi,:XǬ͌NMFVLI\ͥқ~zUNK:,j}AZ dI62:!r,HpT r0*9w\ʹz\V:RXkd/"nTjŸZ$|[f;veQ~rvrCo%5Kp-\tq7Wނp6ܗ{W5᪾+^IkyHR1 ٕ0tCwGuv:=m-.vc]`cYju˗o{6yA}D\@o?sJiIf|IQsBjr46J22UcԺ=>>/4?YO ] 8z }p+$t{A_7ޮvMnkDtJ@O;|vqse;w}`]%\=ѿ?_s}R/6ZCͨ%=dPjѮ?џ7|MyC_"v|ן"o.U]պ)nU>Vw}d~x?]_:[x(|GC mu_ߒSw|s77}Sf47T.NRL v*,-X XE*VfJNxڦ&6RVa=~~~{n.@tن>jH+IH~0rs6#AB?%R$͡V+߯z(CrDg JtuO=sj 1 ˚,QkΤt%T uf/,N~nM'"\+=->[uFL\!KqMkW/c7,fE E~./j]~@z1?zqx}]7 Ȑe^snq߿bVj5duar7ٴ ˜Yï;xt5KDj/^ίBmƫ)Y,*-O,rϽ!9$U ApcR[xO&E~M`6?ެl7rV;ޅmC?b#sw0 ( `z2\ml>У88X79?ez:ϧuFLX-xftr9:?h>ן3L׃x4 h('Oؓt`a>p{)c[ؔq '_@&~AJ-`d N^MGn 6RBJOY4^VQ!rq`*׊H+9Cf+0YXɾԨ|m~]/9bxKRĈ#d4 y\3HchbP'0 J[83PA2 EP%P#ģ\0(CgPT 8q0,&nxwB AcbK V( OH%`&EP-Qs)hW+*/7I {XPFX-P`%]Y 3{jxs7Aw+'ծ>t2(~ד\N5(gdx_e|$Pahőo7*g;zgt<>_咴r{jaFAS$PΨR`"8|OR$$8GP43 hP;l! 6qx'[l70#!0 ^'mp"+dFS"S -@ pkB%k {c]3}_dN(ʂ2R T:NI*bmtd%D'PB/^۟+4cHIļ'e K v1ZRLB6IT>9CgX2!5b8vK}whyaDaZ,6#~"t7?k6L <*TGAfG-96قG>Ӎ9'r2 fdǴȂy>䛲9SO'9 S~n|u?鲼d#b'zx;t2g꼟BQt rŽL+)G9#jqmQ@Jz!<; rnč8 pgY`;=f*`f;`t^^fƨ'Ρ`YI<UmiH/ziAX%t(*R%V^z[%&y 0k^,g>{mQb=dYQ Y.UK[){YR )ii F I! ARFm:%*E1)S%E&}-Aބ3VןᆗOxy$G8:`y1!BAAn|~nzxr54YMYN{HW]c3y1(of3_n]/c3Tz !c7b{BP$% L_YZqsk%TrrV9ܨŽ-"V]_F%(> nieLW9'ľtt%V)] YrJ#S|+RH*xVҒ8 K:=2ޗYbA zAyo)HY"1Z(b;˝^iTIb9Ye>g>gGh@026)T6J(PAr h:Q ģhs^PeZ j1qv*eU͐EìñzC,'J?#uHc ;;urސ6ŵOʌcR/U믥pOQGV Eh΂ I` F9#QqL`JDIeDCˮPF$O)cb┡`:&YtMJiPJ:Ҟ8=c9RӌBU>/\I6ȸFS{ؽ&|Tw~{CFhXϿp}P.H4B<Jb!q*%8PG $|+ѲI0d"#{Yv\!= 161aAX$)챋c8QCŴPv`PG#!2W(bLY &@P6.W.= I]FR !Ȣ& !a #-Psa1qvÆ/k&fx(~<#Q;Yu$myWj+C A0$HwXH!)j Đ`BQh ;orcS4=MyԂg|PF:2i&4wLsW/¸)YLKmahzmo)ƽ-2saa>~Ib@⣄?\<ժϪ9\41r? K(5{,kbmcW?q}ܼ\1a^t\`c8eYCphKVCAI0KSS.C!,24hIJ$E>&GmH*)n({yIu Ju҈UI 20^L+Mx*}O1qvcg-ӕbos^ߕyԉjn>z2S!,6,b2>Hy"XtDJ(*/TAqM-q^S'+!y\/`\^jƓd[H-"ȡ0uKyIŔh\(GuhR<&<2bV:.yR𠢷XB7{WAOO&˷ڞ¼aIdi~PI㛪6TRSFõ D3hU"HМJ$D/ĀUXY-Ea,/-pNa}ϟ'!dh/_;蕤jFWDw-ނS[Ci^R)&̙pS%8oNw>_ӧ_R,QQI3Uv̕ FU)V%4<ɤl%]^R"ra @"As5@L%~Bl>)1 sqDžtvwk4Bg!Vr~޳yϝH°>~ j˧FL6)'b;GQxȣڹbJ0KPE(1R, \8VgMI^2!osTkd@QѨR K[goY_&y:3uuWz҅͜+zfBR ,tqjNg ZNgJy觳d)Og΢YRq{6,JCviC1 8N3YcL`J&9$%[1U"eٔH˒կ]JLeԂۋQɣվtɔ^S88%RG^K~S# aח%}7X1I -KhQ2Q$Q&L?s§c;J#رnC_Q~0A|Mq~^@\̘Y9L?:Trk6 &T܉H5˹H-bBE aR6*ƜSS" J8A8B dc1nIc"1T QEǖѓ,|BP9*]@H\"4$idZa6j͙DLIǐ<՞J1p.]ĥ5pwqhNPWI?ӭ蛞yX?Mղ~E nWgp[6'8I s)7.(jIXȢ:{MR401p<Yz:q]Hh HHsbY:ߥuA^ ѣh(bX#4AAE˅Ex33YdD! 
9a9 ij3!rU1m"b#a+"$K!Y_"Pk}d>`172".j5pa9Hp{ZqOR 'wx!nv|ê=c#lUlrQkAK3G&UPBKk$)4:,q)]- ..W/xӏn.V{5> g͉0Wc|y?8@wTbyջ*-">b6 -/dπ1돮ޗg)g̒n[(o~.L}|j?f Uf>Gj{ߋ߾bgŻMA&ꧥ欱RhKH*Dr28Wρc?f?y&trg/Bo~ȬI͏R"U&kRA-Nzv\9|&ĬQTk&˛AUX׼ VW6b@Tr%dÜcffn t-[Q{Cd8&nGYzfs<1VS<wum,6 dWoIQzMnwd[-nܷ)pJs*;]o5p6+=&{8GKqJ);ysѼz[s\xUU>Xno\b:SnIVPOό8"ӤA4\*T@M0iO15)$06rܣqm;:;(4\i2IZ)9D3/5( 0ϒc|8٭ҙ#q8SPDdFhϐYB0-*$RPye:k -鬢?ndp jy}Z*(!!Hͬ(sBYK<,3LxQ1<8.בc I ׌z >︢B@@QFŒF=%n͵Ed$Xa\jji:;iN}(r;'"WN ,:lR< &'r2EJTr9>t(wxe(;*QxĔˁQA,*#u,W`$QWEfM(]!EzL~WՆb:8i!W0[t.KaTkKAsN<4RIkP> t I#BGl 'NetB?aj]UZwgt%֍fSb]¤I/Q %%LѲ6L.6sK* `.Cw1*[hd6oɝE]th=Kun[m(k@sJ>K,h`[@.*b"uyW*kO۷\R􌹁ps⾹ec=c*>],/?DP obn|{U[?ۛ_׃ &XhO|{mCzZ4V^hFt\nfW?n9}skAw ]A[Ymi%%`ܯM3MJR+jD@ŊQ"γ$ߣT}Td̝SDwٛLwU2rlo}'$VCXd$$w^i{qt폦nhnƞAK<͵GrՀ RrXĨ'*ȕpV|)69f#EHg<s ԀUzetGᄌ>v)+xjYl mz)V|4ˊM?Mha=`.㌹1 ʅsmQX_N-}vJL D?a&ϲҪ@ 1f`냁YC,4y{~77&G^k&OCȭs~ 9JpS 2E8BRHK*)\.fwT'Ad\cZ]VWk9pke(*4iQu M~Q_y߿ 4\-Ff<̘>xӏ,f\5*>LΘ7)8S 6wÛyb4"_׳?zk-o?pg?qm#P#ʣӥQBJTMReb*ΪΪ>R:<̈\>%u` .+.0bHa䎓i7T8%{ > PHV'S.ygҫ\ne5p oF~> {lԺ33_ןJO][an;4T5ژnY͙ ZnO=M9W&sōY ]9rnL қL[g5r6~\۵Z6.N~56z=wxΣ]f=uѡ=Xht| ^=Gfm'mӐ>ke;3k}[J|WFSr%"q%fp9?WbVcw%f(\Еhnl8vTUVcgWJ%;vͰ+ֳ7p'ldWOGӠ=ZŮ{buVJKijۋD?<747fqGOpm18lhNyp0Y kQ9H/D B2^˽s-l˪,./EQA񨟞KӅ0'C~\>)'Cܚ B8br:|!MT3o[my_홍Z}RpK(1ɽ6/Pb;l62`*R덅2aJ*D KW쵝Bn;Ktݦ(>RFO$ ՞R])0?9+bJޞ%!COS- s'Ť.KOc6M@h TRbx2E2iUF*N1}cA+ű;2 s  ?M 3Ů\ة ή2t+n9OH2u**C+Ա zJXBkxU9quybƓx,]7vd! p0g.h! ϴI0 x$m\1\m{*G b19og{,Isp!3|7Gſ"We.C{zrlebӼx{7uxE&948)yr8$AD&*'brXHEQGF9/gEm܌Lȷg;=`AOF6gp>ٜGPΔxY`!Y1*a/çJĔ&Sz%aSңe{ܗlYM5?j]{bI3 ڻ^L`^h|;tuOߺgkGtyTv`"g֫g*7%L H%jGJ&b7-.BJ1WB:O_|/w)hD:sI"jV2Zz`=xp {}K(w/Hs_܌"n{ڲ+GꄖpTY/y!cUW5:d*THN-*Q0gHrp2ĀxRC#89].#pkБJL#9l8qה{xl/jbZ n][',_*u_ *]'ҤJe2'b *)~iްKV7>ZoX \-sK5 *™uK/XӖqwI&SIUJkɘ* 'W{ԪΌ@pJp) mX`Wf! X JZu["$=J LAqR<"\!a:;[NPl~ ٔWd% WۨK0qsymJ'om[L8Wkϓ[XŦŜpo얩-~zN:Gf{Ӭ's7~Md{,ͼJM͞nv}= NKLe$C^>uBφn/&?aQ}{Qh* @ [J<+H%uImki\[ߨR'i'f[CΠ{z\V HCp+B t2:k8!32%M9 USs..5ބf4s6n3j0j1T4~W}.1+Om$-Q8/)DR& &&.!(^"D"Znȸ `8`Eo$ 11-7,=K.1dhwVpp nUY$+'^Bn=4qg/\+&P^8J'ϵ#KEkyVsEbtDB2UȞ/ƿ^Rio!Gc s`4aH *Jh y,m" W-KުK7ObM# 3܍ntyʚ7`3m3!hS 6aFmFͰ5!GeIDÆ⫇=Cë (W@T{_=(K&!§k(5n} TMaU4ݺS~+V|p2{cJ qPדe<=|?+4H0~猗˕&T 2LK: اȘ5 OߟtaǷ˟6.ߋhm_RFJ0PS2b8\(4$5nFIvr5A[ `<|tFw5&eZŴȳOFcR,Lddsv[̦jֵ`\EtM%zKRBOnuU'#d/Z-jϥ`:<1lG^t[ zt⼞Y{yQ:]Uup ư.LSx`3c)Z#XPΐiC>ȝ(TD ]uV::-'Qm³U8Tf'N6J£~@8X+BT4 *bN2^>w^y*j*j~;#ڣ b4zA()jkPDv rG6v6l*1&(G)t3慰 \џΖzN]/H 3|/,t+),̼V:q\WgxPħ:'Y8ό'x66M$j]0[(Y6w)msv=]xҁVY ȲHv$v@dCDXgwg2H+0q"dNzLLEN1u$e|rZYW;> ZB^KRI8鼎,HI 'T F#!%ٳ@,pR-!D"J`Ψ2/ [YKmXp^9VgP+ގz;{W|"yrNA{$i,(-#[ 3қ0JH[tI* NY]uF;]]^ oGmtȒԴAI 2U9t^#B9F8;nU%/zr6|cUid4 .<3hSgmJ+R@ g#=c9w8Bv1wo z7)5h} RCQg2.]Rv2 HMNlb=X𛤶fWX&kR2͸àDIXh9k:<~Ȯodwc:9 %r)VH0ƕA,#r{a xp}q~ 쾺~zօ6»7! #$L&f-`4 "^b8h,(lMQ״BSbu&m%El[zD7umr~oVBqdP΄ۭ+]̛ Уt{{sN ne$Ӭce@5dc$&c \:u$a G=WÇ&xFuy,eJh,2Ѭ 28،E`9Dd,y1/5 6xT^5&`ߴ0xYbkSg7qmܾi Z),Fx=NئD,UW2ͫ蕿䧻LG1ǟZ Ï$&K~˕j;jg[69_+3Lʧ J-g[Z{g^]NͣW5Pw0+ fI~4ݬ~7=ciZ6Jtu9aR7u`:eݻf_~\?ߍ&S{I0?{F俊ض(ep;d nQ}%$'_[q˲Mr8j,*Fݖv6kS}2N۸7"nOYGŐ <-q_^s V\M-#r\Ijxm2ӀR`CâF!E5Pj;[]X7#LIhf6gTٱrҀ^LR4**-g޹ʙtRR{D.K(![o uBBdɱ}keÚw֚fsL{^Uomw{"8B.UIRbB)&PARKˈ<#~2\ DQ"sIhKN{ :XѸ) Q$&CTt!@OZ"~ UNCXmuW1EVLԑ+/<+͜K>0-8FLbJng1,??ā۟jkFhgdDC\@# z0 =iˡ4s3R]A5 *A踰.8,ʂmP d(#aR{Q"r}ta gυ Gh:g };sXDp1~jUqKFNqiTa~#⅙˶}tY=i]$0]":sykqK@3L읲2O|o 5xc,]$ Gt펙6Pr6nMn:9Jn8Y9׿WoS0hڣc<V,WVf~89ZJpJ^ܦ ˉ=3b=qF{!m!8VN}teF֣y9_'ߝ- n F ~'$be/9&?E>9x{' 07Խ#yaD0JsUc)8:jU+;t2>_-ḅŦɩelI7W\:/G2"tKgs^yTM3;0(Mۭ~:$}44|uϴ=+'lDbaIWFFI1?ⲋ/ukG Sĩx4! 
ɟ~?>}h?~x; De83&b|C Skho5jz[ {q97VXB %k6F*[Dxܢ7A ̄d3YtFyOȥ̣])#0quut5]WcU^  M*#B r D暠kP"D=nYɆmV:uPzWTuJ>=ԧJtr #2M ;2Rcd:g TiVґ%@ ܄3ցWo8ӳqj|QfF>dveD{BFWM>wӤo,rݜ r{ -B}zKV-SIO!&4^ m!**y!]( Toq'\wY{U=cs:_vOJ/dUye.Em}H hgF],d(4$U"tjkrJ9V<ڔ%z0hI<&3xIs"TmX5c=RMVB]YA>.\ST8+ȸ'Pu4?x޿]ƏO+9ZgITD)G k!!s6hHMR]ʻboe^72-C1|L (jSj!F 24mǬ9bf9k0K&hjܱ+Z[Z{@kRjhbr  Zfȶxu3#f*d!2@VthH$dQ 2& !e;Q.VakԯŸ+W# qЈdp$d0] eX!|PL8gGXN,XU#:0 HNgLD.z&-F 1%-Ho8VY#V#g\H:^7K%_g5.^t7q% =s2YUEa3J 69^DCtd qЋwEV{5Ჿ;GPa$75?/#-/RFAn7D?>SRp)Uw4/\ALytR!1̼v1< J4rn2tPǡPwG+Z`$<̓ "dBfx ɚqBʞuk#- *iDB^G<&F 1:8PQwژE vj܎IC7W4WR@Ş#QSr=52<9Oa(غw% 3AR]<ջYNOXV:q#6nD- wmKӏ7>׋7+ܰ&J7cs:~>/uykDIk60P"ճ6$Y&$p@o%r&}f\,_]]`꒗n+F{ŜĬfubjqr6,ܞ2e'P 椕N`H&BjpTo$/w\,hL4GKt(;oօ WOT 3&!:ke& g:̘~0s>=ȣw  )G<ȔN5}mbӕgw73q+2 r4Z:@`2Fg$g!(!J`WJg $ 1T<'LNYtvIj ĐFf/\*6p!q]<w!B,Sa2׎EN/Y/ZK7Lx]y_0 >%n> cw\@̀WZ1.>gZ8c!vy2ؾW7)W]7~OLKqD'L>:?OI)if0rx ]GL:-ɛB?ida@h5Z21}Rs)|,[}Ƣ~eG,+1w2ŵvׯ!~ڛym.Znt܅ޕ "76/b[9ozl{D jPvuӵ=-d~K6 QؼmԲwiy/=_jfVO7{. ox.;eIOzC5kjmaӛ66O9np<ˍr<5j='aq *T-eI2 Iەmrfk W;3՟ }F~ܾCvzv;Y6j>u^?E-Rp-7,b^G?w~_/_ zQ?}gq6;q{MnfhT.ꍨ[ql&t6H8-_^%_hqri٧[BD ~J\IB9+lmVזYWn-4Mc}{7z y/ 7"O C΍5k"dYl2Nc{hW;_j7]3J>Țqw-tv}okm76})1|(A#r |c?OAEh, xrhR>28$S<6K 0SILg4^3hMcN #A2A2$$I곘 -+%&RO$ )pq˧;ԎKc5NjR^Y)8XgTHBR8( )IؐbjRXGXB Fԝ@Qm(N*m{"+Jї]Ѓe`W=&z',JPqe E)3(ǁ->i]YsG+z]7YŮVW:,{f("1 $ jt 6HH–><ɢ?mʶ4K=\EITÊGDZ\\h.B7/"ObX#MWU,!0n0^U6-*AI\]pc-1qYOk}h\ރmTD呶L긟BY$8-Lsj$Y~ FaA='Eac䦟yM,'ROΩAɧ. MBPBKk$)4:,q!~ .HJd  ͐xr]./OZuZ=CYK%<_#b(z5i X,{g{|IK: !qfx+;G".̚m6ow6iJ>ŒZH@]7~Zi{ZO2PY-%KjJ/erT\A>B)Ğ~I*O?GW_OkOm7hW᭥#C+!ZdMQg-(gzts7h%9;"ZU!W,Ab .Y{zڬeaO9;-^*y~4^Ӹmܮ:$AM|oTbms]|CVM:l(imeED뗩qgnⲇ^Ѷdl4Uk$϶M&}6ѷ6;iruu/.Uxv}!}Z/䶥 3^oiA02k~U /LSQŘWsp 8›UTRj؝iON'щÛOu6p9꽯jGkT n,F78>k EL3٧wPn7t__At>u+shW)pJs*: neX3i\[pˣ~%y/=J)F\hyU\dڒ6ŚTU̝M7>J_n(+YC̈#Rhz`Dp2,8s`cf;:mCwktNYo/#(@mT@"4 `' 6QZ2}1_QpQ 73_4vBDt"Kt*9JƈE(o(iQW£4F&\:d  Աs/G̚nQe>]Un} *L.KaTkKAsN<4RIkP> t I#eM|.3glQ jC뉌.4MFWQbϝ p{a/w%\e]\+m"%ܗYUZSs)}¨ܦ7Y{C\IH)x(e׮K>vրz1"|4X =h6]TDby[}.zΘ7?P+,+Sp j9*q+)m~m~ ? Z0M\~yhC~Z4o4On]{? kM[ Zh+ծlH뉐؝!rj~I Q}i2Kϙ+%LO`e<u+WN? ͬӔD0Q̓#!H\/DZqSEh.]d^RaY _o f2mV7Mpgj<4ⱳVٻ?҆K'fϜNy|p*DMP*Juü5zx*׳beZyDX_08ӸYBK |{oƾDY>${Sh9?Aص7eZ_xV4'?YڍL.E/!s&6\mq}ۇݎqw>dЅ3BFǑUsgg[{gLͣg^ܮx!׀΃*]6ڟNy,4^J7.&dCSJj6N_rOYot|nqYe&x*|O߿/*m2JR ^ތhev>>*dVݨȆQQ|J"Omu#:.e VJP*rԁK&h,#e>n]VDZ%-(TvI*%%zI8DF:H(Ip|P y@A~e&qf@/|;Fs)$8nVCwvS%{CUH%yU>*wOADʥD"FR!7CD MꪤuM֥WTKS}yrp=i%qU:ϕf4DDjKᐲ `=N &МM4'EBLRR$$E>&GmH*)nV[=2Υ :Gu&AU  5=Оi O]Άs7E ].I/>͐kMqŏ @/#P$mqO1{y1Yxj[.hHypSEj(0@;2589͝rfY]a}]P6p:JYsA SokX>(I(邎sSυrT'p@)c3(#n(E68Ւ3'=ݛK-S0 L'2a$Jѹ }~(\9LRHJ#/I`,i@3I {Ӽ0U$ȣT B %y#óϲ|u^M}@9JX3*`(K ]x0t${ao?n;];)!jxJ9(As rŽ.љVSx }4B1xq}["V SLʣ#:b\i;n酷wvp6 )UE+U^%\ou{,:99<^TRͨ|(lvش~{bKn5MlOPW 9FbiFf_+wO;/&Wl`sp{OW.8?L*s7-m!m=qړ{h놵w#Yk70`X(\Yf>{u^I6WEcU %wQMO(R=Fռ LTr~Z0bC>%q$|?$h+*DH*m$GQ&K}JFC=t)+cM}).ZN2?{F\OIcQ wdc"_/ˢUT%jHJjJ4d˜fOMOS. %Eu΢6Vky/w#e*˜pS`gjJךoSwZ˽)r׮b^'I#3p<-1͜G͜˫B9_i&&q W,7ix[ZV'WiHKS2 m")D֖i2$=Ft:g XiV a(R@{Yפ`*t?(4:=b͝R-V{%Sϖ]ei\ n LSigkKܺtA Łpoy˓dALG2 j!A!U\.ayZ!AJ̉ls\`p mRr)ie[-E\ID.2MdhHޡt6^ʻr{-dK7u `'9s8cV%FPKH^*F1j `%9&%$*jfP`&I2-վ/D&Wl٬uh&dIc-_myƝ79XA3d#,s6'!=4Q!+ʼnVie(^HQ`Sj"d4o,fu)`lmrZlFl?K.澠vٱ/,صZedLE˱f5 s"kL 5L_i][6+m\Vq ,CE aML,hAcd8wÜT'\e<&fƧ,*ޢ9I 6"/5 IRaŝaD<\]@M^Og[Fn-\\dw !f>]"M`(u: q* SO5JFr1DA>'e)I/^ OV?/:IAJ 2IIÔc'/*{LZbF ʺD'rh3'c1_GTdD^#j}u nKd{˺5԰M?}a7oj}-%I^*i\ŠwqװƐ ~c[R123$7]"\; ۿ?+&Oi(FѿoT_4~.1'yZyKN]d>KJnǔZҋV2$IVrS 0l]sc'~:81͘BDKOuڪ޵η-fX~Z$ҵPZ Zl>2vh ]`W^*}g}r1Gџ~T~sQ,>ͦɇ#GQ/'QS ģZO.'Nnvbbm5в~rc$4XDʘX9i.fɨȏs-2ec$ G:  +\lBu=9yن1B@RjTg?hg=dzs`k pO1+u[,RJ=-óEz)lUpJpX&MS*R. 
p~Jrä9 *}0pUEy(pErrWC#W``HZI pJqVmUZFBtaWGn2-Ӓ.OUJI٥Es,ۭvsqOÏW266^X!drZyk{O3Ҿ4ަտots{|tc" MT$ش)"R:Z&| yEjx#5 oet%$|[hz}і6?M./+&_ t_WG.ٿǗ#k3Ιn'cbyn Em&9^ι-oWySO0%K$2 , iY@RJ>x1,ZI{2嬜 OGB)Bd4A@ke;/oua|V}) b(J1P ldJGК;&7)k4F͕Y$BJD"Jΐ5 9ZkCF&\K1DQh‡`Hk̼mpZCI0>;8ڹI_4Epyhvӂ"h:\&fWUBV9-io38bUB=MV*Y<#'u5 r&($/9 HijQjp}J>%"N`䶔4rK{mBRE>}(4*R2&"%dRs\~o Khyl._^kqv. ×JA/~׀;/1JW=g\Fg P>jda@L=5S3EZ p)\(U}Ƃ q~e,+&Y9 R%b(o^[Y^nE2^X2n޾z\BmZYtZu~ 1'bo)SZ_Z7뻧Ἐnz|׏f!l墮ݮޕoo+آ祖CI=oqu/y QϮ w\;<ԭaj.t3K:Omuҿ3W)znoJӸnlO}pS;O9$ɻy!A k3KEՑ 3,s6 FvQڧ-'֎_9K= ]yբzj[K/RVvyYs5C[pA䆫 ,JˤɐAPƨpNgϥ<Ti@ B4aW,I"$ӓQr1 9I @ 0)ȣ``rHH9 ;ύ R Rd܅a"G? @9+jJ|8b'x|Z{;~vew5uq<W4L0dUl@+) &NG!WE`JbRl4Lx*x2;Bπ3Ġմ~SePiѣ5 b mQFQmn7 ^JR / 5Z.tqgoƟ=5aB勼e0* K<ЙY|">0qWQEۖ_i+={lN0x įG[v%vTH-i'ćD~=[hoSDi=mp3)EG/IQ"nԒ0l=g݋mxV&a\oLQ7lnQO4ڥRݳ!*Ux61,nX)Us;60~q`x/0 $HzU@Oi}|&67l|7SV]c6PLdQ.{:L{Rɘ^O_'oZsaMƢxn#;_a}h hxaCR)Y\N9һ As1g+XחUTOj9՘yZ;J( #KI`E候$Mƌ7vJ')d=2B9Y} <8m$ ZMFhнoIҶJC3> }D=Ϲzris׋=v|(t͞sLFy֡Ph.@K&a6=Sibީ$lr{}[M=;AZ i>i yځ);fZ.bT6 ]o9W|;`&Yd rv,EYv${2Wl=e;6eNc["b=T B`]՟T΂(^HfdNJVd(Gz*i#1g9^&1wF K ĉLM . 9{{V)Ud`!1S ߞ /-#\22pxK!,C},SၱUV=1YD jx;i%}Y(w(%aObjT(Ij8(:kw:ceZw/|36 6E6b2T8/Dģxت6HG3taG'ɬn)׷[UK@,`G 6Q Vϊ/sqb~?z i`:OP\/~EՋy|IՋ6ޝ^}g:8ދ`n٧z}Ƌh^w}v/R_JB^ (Y|=_ղy*V`4季ꆅ;"w_*O,_訿9>NfsOyZ4 {yK<>3bnκ"hzh}ϐ>2/j*Op{z3Fs.9+[:` >׌ ]>wGZ9ʒ{_m*9#&ˠj%2,:$mPNݦc"la^߈Aizn Sfi]T%Y`*RZ˜5&H"hi MpNX?avA5 C݃|DZ9پ7%Oe5a_1q{Et;|PrmdWwiwnxYonw%#u~7oҪ6ݶX#CLB$d"Y6]F2a&JIZP/&{Xȗ_鬮v2ν4<`l t]:BYغ:gцHf־o:N=Q0[m,U4R{jga8e]4,յɲRgE J;ࡋԧ?U"Cb_ ,DNDl!J*S0 u 1 m==N}kZpukeF`\oFͥ *wG0QԡhIh̨mE]MBIѢ0D38RU`QJ%JIoQ/ݮ?1sd3lIs@"IWzW@9"6Lѧ]޿wVplKD{ |830y}Aޣ@lɜ+a6 Z[CI wh,تuOf{h*F|%[lW/Tme| >8 cn>Uҷb &f󬶅vf;&5`!uXś-/{¾O}]/Kac8MR_a~ {dJR1Ӷ%-V<1H[(P:{xIR  DFSp_I8;o9m\j``< #@%WE]՟*2e"h;mvIbCPTd]}zM>'z| >C6ٹ|Me dҐP@?EAØ8eHZ K[U G瓎 a&em>u#K"՞[sKiϮ&Փl0r\L ;(IVbH2*A#] 8`A&Ax?XF]R7K( Z) (?;jk sJ2b!B>U:1zh&WiP4T^I !,QX;I^*׺ 82N°ELÈ[! U{/g}ø?ssoNUЪ(Ÿg{:OdG_rThJKFV^ҧzSQH/f6ߋ.+枼"y{y %szfָޑ́J B=N OwfǬvhŵͼU%SP $xs]A *yI},=(bF|r{| ҎiJ n$#isHQ1j0Yaax*֝p2>].l&g/׏Qg-%l}r8M'3^eT=rG'W>7cޏ~>$?K5KtYWW&a[anE'y_N- gx׺uNLS1d~y>׿X{?}Û|׼1Zm R{˽|5C+jho948}Abo1V-fܿ\W2qAUcq7/c=}Medc\k#]v\7 ;7J 5, acvE)2G5m  #X`0.]/>la,p{JdѤr(G=bk%2(d";49ٮҭxΓopiҫq81LJkPOL&Xy2=1+F=Z'&=1Þ\{pLFAZ9m)JZXjDjo#oJId!lpNK&GkK97 *Uy7eXJ.:[NDD"*}ZwOOȡ5!/Z/awvÓy8[__*Pj[tC,TH,Zjޙ '$9tbpIF0F@gK%FQ=\. J.9csQ@ٕPXI֌ȹY3*ta3θ.ƺ ppEQIEk/nc2gt<_ ]g\ccT)ȀJ)F KI0SdYm6|*&joٶQ+8({!+1*PUm#k&Av'$*"Y׺r+rnpϙۢqǶhm`-_3Bd+ _P+IbPi6['0b(m{6l!32dYd  8RM!t'6ɆX6#f}n=U1nFlwՈFF4 }<:m5"iB4M8CJ dKB8\j5UmLFdѳժT=sLNJdJlI+m!XS pksn_ud8µMf\rWE7A/^EH_" HE)SE h- u kbFfqG}x>⣅>\<T*gNgt*Oҹ)@KY&^3.>uQ`!Kst\j2 YjO˹;3>xgzq_ iwJu0ۃ`w1}@*RqN/u|8o_R*nCI}$^Y}_61t@%e@dÔc2rO^Td5`F.d uD'k9Lh3Ylc 1Hzf:k#j{3Bz# (_B>[nrͭ[coz$[sDlteجYGQW}-&_Hr[C)R u # QE}E;4JvZA ZUy{R?g^q+PcK" 9+:f+:KBPjOLNۗdt?|bqU;8|b}μc~cxPd]ã\eX9ljIATGIs-Z d^C H#H; IWل*80{r 1B@Pjv]WYo@']^W ~'_2aO14+gyTb`.p!WK)1\K *-k%U!\NEB.KQWL3]]*mͨ+'?z\+i˨5pu2*Q/PW}_=s~AlŨ+"W0uU. 
4qHGf_[V͓8;J [5;@i;6] S-wԑȤV^'A9OzɭiuOʵIBKg jctC6$oC6$oC6$oC6$oC6$oC6$oCu$|!wwRp˭mEZV%kEZQV%k`VI(Y+J֊d(Y+JLCL*z.([VJ\L@!K j5-Pвl,H<ɍu&q7caW`WƐ9~ :&wYɹz@IR3(iO^Ds%IzUr?Z+`6B;i%&lm:H"wEEP2E@uQ '8^M]^%ڿLz*ѠniQ)ؘ7 F$u6&i&$2 )M _oϥcd5ثc$1r1> ^uVw(Fd1nb$I]JFZ^_Wk7q[o6]{r1~\a,XZ8Jx}pشs״/ir2a%2czْ20ؘo{4wP)AJ!!Q(&"pN7)k4F6T DR\(],+ :ˬTA=Gkm{CцAQz m ]ƅwSۻa"~:2;`ޫfgJnCn&x_%-d@}'=;2K\6{OiSδx XPP<ƓDH9d KT,g:c-J}J>V51;Ue'^A &ԙ{*:Q:j'F%XreS9;r;v%S8|.qwl7V0ݲ٫~IBċr5|֘s*-6wi0TޝM/x_-LgeTLӨuKyͫn^zՙɁV L .H3Of\̲q;+99&$5)"QJ,RyOň&sȑCBW#~B=rL>*ݸIzmmJz0H_ޥ/]!6&;Y2p >vmLFҦ%qWn]6O]fl L7`4n6z7-.lظsYZMqϣ ˼zC/jp+aTpE]ZL JEZkZt\AJM#F(Wz?%K>|!6-G áCۈzQT۳!>Y^ċOA_ 7erL$iCaH+K[(Y*Z#.lAۺ!IS/W{۱78$%}r7NgGYZFac RWvlFzFYp I9oRW-:WΑ|t?ǒo{ͤ][<ϣI ?wyЬ]7;L;e6oӺny5oO斻,b֯{"Vњ)ҬRd3i(.܍'tBiƷ=V _HMok,L2B,kf-f,-s⪘ Dr1ݼ׻d#< *^6w/UNOfJ|;ħ8#ɻ|%jxR9 u~ukO7퟈%;x=,&oʓכѓΏi:j# YF$>%t;Fh5rESgo[:_F(_K/BGgnf^QښIXWg!P^|}yL?/WOTD̘cJbgJ-(tQf3eMN%Q \NBP,KEԈ"Ya“aHMic&5Ew's,  Y ZFwȶ2K(kV5S+]7?I =+>OBMN!#Elj)'Ofɋ_)a:Vi8j<,Ujo1A&v xcgc"N:Q&-sOdM \B1K&{!WMED, _-Wte9AL'UYSlz07A:qȾa*K_` z}{ C(,KPzCN怄69؄4rJ`/O I%;K@'΃pLWB[}*9z빃WVR;2h,n 8r^ˑ'hKK+.1u^؆$$&8)7@^rPO:DLxذީ:^0& !#dg9^8CveB-c ZVH[]r-Ě@e }絾J+wf6طͿ9#XMȆI=0 N*8$ɒW]$|\hPy|B?lD$"A'e-COF# AA֖UKesVBG*{kQct{%q.g02cx@k#yIxN ͼDOrZgV)Y`սaT{~K]OBh gՁg&lq`-/:+:rːޕ:i+F/ɷ OIXaX99$,.lzlh_mȲWwxbe(b-#Cr3$gq:~|v%-V]4~[Kսؖ5k5q.n^\゚mBS[_mfd 7n[g?y9,/Ҹ%A`7|Oo/ǃѢD-Λx4Knf ^{2m㋣kcb1.{s2S߉]%NΆyq{R6J}R%*wߦKVC?*}\H:&~coZQ@}QzpVڤ|wWSؾ.m}':&ZA RAj~JR,rHs`IC+e\/b{}̛IJο] Sܫ7,""RrM)x@5Eǧ0QꝐC2v{q~k#y6D-*VtJ6*MDRs65*}PI(.R2I`#15ZNacdžڙ8k365cy\rB;ӓMiI=/mP2س:}wb6A@75I' n9\$׈' !}@\FG.Z-UVA1x%A'1mTBuǛMA2h&d9R79P5ْ/ R]#cg܏qΰ3 c,,zeEe-wfm'8N>7%Zg5^a#$d8 ۡI:G[ DE0Tg')0&{*0"Lط\VJEDc;FĹ(ڝiǮ6P`xIj@!jzJR"cr%3[a*clha߮D)/FM`5(b/:b65)HHLRak l|׉:~upձ"<9X\_y6AG ^0:h Y2-9I@^]Ȣ52PxpPT"[O2dm yiKL1PQLRDଵ / s?wf_Co$oL-Gͣ~->Cb:Ygq7_|A1 Y"Sl&%p:ml_~1#'| sZ<7;aJn-ta[Ki;Ů_Wzr/Yi,zYLz\kT쳐*9I8 CN`B_BNZEoSz|;:m]v.B{paȅAi܅Y}vٟ%4{Y}l(e1>*݁^rv7o~Yn3{SNrϣ\X!h02"ƻI:%ku$lR Q'jb”lUBwۜBӴ1s)EBed0*xm@ԙR^{7PK2_!.ʏLn>l %e0˂JZVydu&3EHSn|Eeoxy/MQ0((5U"+9'ɀ yPru ͶxAl#GB6REfo@FPu3 p2KIdCz!mt@ZRGم6h%1o!bdN*KK/ӈ5/"Ke&YXҟsOQhI am؟|uO8O{gqf,wꛪ9L}Lzݗ.'Sa Lx:sx(ԇ wMjfsu݁=%$@!ѱ!Dx'|G)Q%b*`9\.r'LSWq[.y@c;;:X(U|SFsWSJ#5X՗`$h jmΛŷ1sˎV+*2zz4wo6u=y1: ( &8/u&SwtXS>`4{܋6?߼W_yחx-4ǚǻE"Q}_4?^iij[4V4.{=5U. l7B,3h@۾z~T1sE%1P&cRdaBEʬ%3G":=H^Af)Q; t7.:,qxSBORcs[G9{ɸ2%%'LA@p$O/w%3(m|2Ԧ;;|7peIZXAW*.}*]*%\=CrD` , \JkN>dѩ' &zbzD{W҈݂+tԴGpUjb,ݾUK#\I~4 \UqioUv\pR*zpZY)X`îZ \UiUY՛So{ٕ5]4uWg_pX{HN2gpo(myWA5{,f|ra5X!Np~؟Lƫa)Z ][Jip/s˖-[4ho fwHJL.06d ܽYoz_-b`zkU#v@r%s}`XZp7a3,T)oş'osg2;y*l#$77 OF(-b @)B2$ YG6qWOӹH)ds!, hӉ7/f,>Qʿ-UdpP(9m 1"8Q&cT BAꕰ ۷)' lF\0C>" mRKm|IƄf T1"_5K`q`o`TA))HK.b7i/}eHhS>{"[Uīsf V`x0ȓ@ )%/;N2AR<CPEg#TmT1d¨$,2h ̤tf3((αMjaͺt43Oc9fё (KIUٺk n2P}"ܣʬY0]N-]dX)N dkJ"kLBz !*FN!fh*>bM qIG/ͥqZ@l!=^]9Sz5GQ QɆ\3Dn}lUE0"H&}qNd^Jij- :+ \qяQg]xtXw^b ~J#-.⩌jFdFfveBCO  QA\R}2Qg(M**o ,l*!l+u zZE[PB]Q[؁pYV ++5 e2aݘom0S{"f#nC3!vD `L `iƗ):&|B(6*9mt~.V!joSCAV(J8I9n#XT5 LڲFxw(Ez@?"}_0NMFAꕶebEH]Vkj 32FrTDlZ"5ȲQ"Z PZM 6A j"@H v/UTyOTECktnPƠI 0؛ʀm+fKQED$' GcEIJB6}fؾQFiҬdCҕJU%Uh (yCs ~3XQ|I) >@E&rZ5ȼ P>xmBV5 tq3XG4/! 
O͗Ud 2inuG-#2p ?z,:?jPJ4%1VwV40LG1юk ) e*2bx(%'6CܖSBEk .IҎy r.!z-x t-fM` &$cNҏX-o_q_}Y_N%Z('2-jϛ-;e/H^PglCF}srSq\ >_Ѹlxf*w_ Zo4#ۘ`~U4ErϊV_-]e_䊣?#ӛAkzqgĿd¸fomvqR~{vO'$P-nK/Έo;o˫ٸXmaoي|ϊ&i"BV~suV}:_<-rL on&7qmtT%mn"Ҫe U b X|GjW#Op>R˥?l/z+KGM F:DoPBPɶHGq\vNˢspuq~26|blxwfme}V=4cz٠ggWCIG lHdu3S[;okڸCI_)RFB<]x*-i7so>؟ 4-:7l8PpRH-蓂l]:ȲkΓ&DS|St|jy/ZN55x?<#8Eޙ |DLn4k^4F^g6_1Q8K2#ZjK򍷍1y!:0S.>q D麜p4蕝*~[ހwS77߰8Rݹ"Gtr42|6,tjB[EYN-OӶ)H_eBI1_Ŧ7kZSp_,.8>?]tp} b+]1yfџ.?gl*) ҥGˎ؍uݪzN'5w3: 񶗷E^kgp/?Qݹ}rQR:ھVn/\zپDh필9޾tۗZDW8n pftE(=!ҕAĞ`d7tEpc腮V)=w"F3] ]Y%mI]`@P*C+["C+ӡ#`QWts!jtE(`:@ Zi:++52BWΞ%O$]E,:ݨ+rJD[v?FH p ;UbFVn(v߁[h툮ؚ~f  _J'xt ]ZJI*伣+e8#J̆.<Jh/N $] ,9Ή`++򡫄><$wut% x(#Jʆ\k\[JW %S+En:h*xW bm+@Q]"]i"1'&eCW .&!er+g(A"/kqW2|C=T{=m;R֠+աM)5#JΆ*Jhj;]%m[iJm 45=a$mPxs5v?M&/j?3aw?J~v._l׃~2C&T>I1,Ƅgð 0lBػΰ`1 2,ek]`q\ ]%P򶝂Փc<+LE>tlZNWW::L\T 0W\*խ%t)ҕz%Ɇ\r+@uh=hN$gB4wǻIUBKi*3O+ ]%:+@[RJ(E2xtf(eUB~J(v%RGW+gӳz_ .>!в#]C)[v?.AWͦǂĴ\A'0e2Ԣ*M\A-[cq=HuQ9_|ݦ*ĺ[$DiR,V>ce2gњkO꘩ԙi= AkWz.jlZTI (o9fV/ ugGLx*squەYM>Spl=w@- n'ˡ/H?Bl `hRb#"ᷠyߥe)Wp >;(ߊDdv&\~4zHp} pYm+y_xk M2w*JwPEQҏev4Rm5ɛ`V뗛l؟a̜`NQwԙ#DQ8Nm`NKAED^1Ӂ*c^B$\ ʦa3&n>!\c"{&C')1 F,s$#7TJ{d8SǏMlٺ}`b9XTiF͇ï IpyH@8 46K- Ybd ukAB^Kp뼍t/^L& }Ui.g;]#{bB=ǿLz (7qeyo:3%jmlMtҾvS*${ .7*`<Ygb#$Jty<3慨=wO޶"vkG:}k?hJ=\0½W)-(Q4; a{Dh ukHXG"\JlB Oc*5(K)µaނkY S;m5 ~=H>UV!É/U}?}cag7%ٍӣo'َfݻw>KzmS@"R9" $1.v}\ȼbfK3&!_H9"x@Q* J3K@G$^k0(0܏a:UQYot}̟c&N(N8 Epo7\HFuO;:ki$ *$p0㐶A{B"TLi/ CԤˮ#5ۑ _=.coad<+QhDiT ~v+f"rFFaG$^o _| ܈8u#zn}VI؍n͜ |S AWgÑ/'[$_ɗbmaKzW`ָOS7ޕ=;~R>{ xSZnG{߻%' JA)n`piGWhrFzW  ۊN[ =kpd\. dGa\ܨx&ڤ]zSSPɪf_JQK9%p݂TGߗ0V\f]HwD{i,~-XxQuζ_ť:EYT_le mrQ0MՎ{Юؙsܕ ٝ ';>'GA ZL#diA:,z^<M5['ipE:K]n4zi z [idagwgPɆldsNe_Λ :ty[36;n:E3m0I!|#9M$xYi(uHNsG5"尗R\>%)$ `h՞}:FY>b%>C] /S^n׷rħ,G~u,{o4I}KC|(\AwPwol?'_;|w-ޤmc cvIOpL@C^_˟yy#>Ƥ+bi*8$jUm=F;ٍ돚q[7t-̨Õ &\;mdTR)nV cwGDQm-9" ,pB{p$Ug]`' ),OpTǀM^ jhgo AHRV-zg bZF &@FS,acp[tt[q:ݩ| >d:[lFHy@0}bʂ\証 &6 Б}`b50$5ILX@9N`@&:$S{+%Row1ן[f} u^e1;9hW)T]K;֐Zo5nu;䢸Vt$WϾRڴwuy.G 4{.D/tgQ(vJ =HPCQ9BP2zEi[?{GaJdQp9!E-ŗXt3_bCnjF2%6^{bW>~Ub@OIЅ\)%cC^PZ;U@72vadUaaT,MXX=ƌB\\݀9Ϳ|_ #vMH( )ha k I 3%JTSۖЌ=FALMb( biN!UcX5RΈM; PP8jO vg>=d\VzE[zRZAkFĶ)kN?)V,daڊ\TF 8w8*r'1K`yG~U bTD 1L8!2Ht] RyGEV%k GĄӑYQ\R5+"f6.΂ r69&)%mlڦGE(8uHn䩸Hq'\pq׊G1qF`d X1YkI6$Ujp1M,x(xmu<7#>zrG3@غ sͿ/qܫíG67E?P7Az+XZX#QQ aDVPzW~UP2d|uh0g.?诿hӢXGN`^G!/Z0Ybߟ< Zv݃Gx\-ճg"Ek=R Fgu%{):5h3QĒ []uFv6$xuNh1ʚRɪ쁊.oB; qud%PY}KF%5!b& MAA݇8 ~#DrB>^ڎMR8 |.zt|L uZ鳥j`p3]bmkMC+iѕLkR7Hʓh3~+, ˺lIբd P:A[U Y˩"o&uR6f$EEib R$ʭYd>:b8bK0v[i 1/|YYvweKY_Yegl8WbG5Ǝ ;W,K~M&VJp98A('(j,)d:;㺉geY4\F|qs_NB}avvzt~1f罴#lg/,]B[vg[cUf-kb;ʊ{P{kHEoZ]"@ 1t = VpX%FȫZ.qIDIN&|/zz<;?}5+1ߦXo+&,/:Bh6M1֢bl]%Z*C흭K=9( o*s݆c>Oʢ"|[b[cv/Mcοh ؛:wEb&J|W`IW=bk+j9P V~1L4uM=PZNQipj|'} UQe4XeMwpѽ&+ɼl]qƄS/wWZȝ͛}0>.SY?GC~8@s=a1z:Ld9=8dzwfB {̼1rzh[]mK^?>k~7s9/}ק1nw_/㯆>|+_cq=E=MFԳ-6*Wg|,5n.n!q`-R!F߫R2{̐|KgAֲ:|HA}ĪX́|;*5<`>G ) V!or\jn~d\+ohn;;[)Շr1)kYAmQ\Ȑu佧a5jଓ!W&DxDhky'`H%PI%+ԡ*|LJJASJ rqP:Edѐ' ;6wQ-9l5麉GMqB@~x+}WsawmgCF?!|{C:|'g'}b~*5*G`JOVIK"H2 ^[WVnk[L}bl ה L[R˘O_)o47- A#}Pu0`!LC*I9ϗAl~3E>< b8Z_ЯW>mMmɞZ1BhRCQ礰0a\&/$43DU! hS!%eϭ7Xÿ#)ES,ia}F>;nMIhЃh򕝎e9"_z$ WhcW3v byknP+_?Q3SQWZl+ɦj`YEEvŔH51L߷\m7 d 6[TTPNNeEka1T6c? 
,ѶJA*b,!Š>!8{٨WwHjFOSwBm/K{n*%ǀU9GfT"YF$ch=b0@Oz+ {+f/S-A)@Y6S ߩ"n '.d'ՍjN줵_Ȅ֌.>ˆj(f%Ok aVm p^5=,ho=WQ>t 4lL%DT\㉰`q@,+Yh\BhoPvlfeh$EFz-*|:'{GxU.q*"fy)cbḆ\|,Paԣzغ!hGT#|NO&>^C uKD`W 6QzːxۅN(`ѤfC#C:R\S(J75ᄢJ C^V>=A u>&pji.Eߦ1lbpb#x'[ŹtͤO{gu5տa6C#q$BHP¾r\PAE*$Wrz/:,w22g|y>;ݧHwN(h8$jP` !o0L94yS=Q;9]Q:Q5[lޕƑd_)~C"@00= Lc잆b1U"DV\ZR>7k!EUrMօ ʪqs"DfP]1V0)ژ u4TU 췫9wǠn>r}M[?]#SWCm={k-r+^5\W|itQ`ZK8O<g̷&HP7DRNRӛ%mʻnjTsP.Akb}Rt0\Q;崶ӗ{9ؽIXʍ8z?|qw?.t:S9V^z@)`ɜA"L!؄LܼQU'Bյ#WצRyOO~_o⨎9>-eč߶Q|PA^,~.Nw~E^>]l݇O [ӕ DTِQbd:mt&9Ώ%f^y>xrqI!.Ok'oѸt4l_蜱se6x}f_a>%s{?NR֖v[!nLVg )ۀm&5!zIKPE0cRN›1-.ot.4B%$k0pf'uZhcVB\G҆^6'+4ך9w 2@ߥɷ]4 4T{x6;[_Hۊ˅>:x! :Q9E4L];?9J}d|HEb5Zڋ:fxyF9Yr6 b|jŻI$EU^܍S5ԺV"ωRRmu ;$ףI Y0^ݑ1̙Kw( %|'ԴIF+V Vh4v^cOI=k],>YЇF.\>E=uk,R rSȘgE¥:itAڳRȮ" oOMwOZK[Dj>c5.d95> :#,Ia%Wl4shRsM#`BY drVѾºL$E3bkk#ѕo(ZT05$e L7¡qm6 (Q!}P/r v( aT@ߢH;X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@*R<'% ~NJ Q!΅}J X4+E%b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V}J I "QH?%~J dX -*e%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J VUE@X@0؉><%O^ +g%7@@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X JoH z[{?_ԔZ=.oWo]ZguX_7Q^H.]}ZYZּ0zɊ|f$NXzuX en]7K8a9{}m݀/x·IQW8Z9[]6Xg6f8?=9hۅ$vyl ;n֑ZȝNPPgONƗc7ѲY;7lM˯bVuޚP_"-L{ua=8B 5)1ԎWu}\ViVGmxalC٘^߆7{Ap}삎<Ė@н2?jdT<ٻp*c^0-![d݂!MQ#P`wl{3,lh19p9Qw3FR\`8ҏկ`\YfZV$ZN[U]R5Zi$Ȅ$LfrZ{XuѾyɔvW]K7$o_roߟZ!wqXFneۯ׎}yMyKg›åkcY/7#ȃ n<Bo(bY3ᦦD:Iœz-$`p1UaVIgڏ\>MOړTXKnu-=GVBTVD]xrM[9*3<3Ky1مfݣ~ @KQˇqpaPÛh6ɚ7;$g Nd/6xϵ_qwHtwz-3酛@J);oydmRGu&~CQQa,`x!Y -~& ~I߻ kW.4%?W .g\ }m("Pq't!g"K'o#H!91"耐Ms񓆀э)m;۴3*GiW?}~#U1mFWߚO0b3qv>jΛ.ؿaD8Ϸ *ZkTܰݛ2RfYt"!Q dbP0'Ƃ񾿺5:S>h?ߝxE [۳:0ٛtgDKY>YNr1~6OW.OKҲVyW~k%C钾[v~ !\:(S;yG(b1 m2//:22>7)|{:EOTnDץ\a0WϛZogY^gqT8Nǯ`%dbʂ\証 &6F >pd01ȯQP[PM] ivzf6YɛVf=,%g&6尌J֒q z `àߩS%IMt$үo~rzoǗ?m$jwr" Gb 3XPN 0$HPCQ9` Y5c5c6rvk|X%.BYFT)J}O2no&ώFn~@w;UB/XcG5HDlJ#!@Q: r(=Nʪ0$cx6 L5@&ґ01qpRe٭mLsٸ\hv`q<){qlp`!#B3:@+e#%l`E'DUF.qVqBdYF]rGQpXyJ_Tdև٭K) b<?Ո2FFl4VAіPƴzPHSbHME/ӎ,r'U2jlֈڂ^S:'_g6.9V/zQ5zы8^Vj TERƩ Ki(زы'ыqǑp9#>~`}x6(*_F^6vE@uwE^rŎَ4F:;8hApQ6OFY9>,!я{2%~7U[A/CcIA]3 rR8 `iBStFG؉B%,Rp^v`BT6"0X(CAAuN uEn Xgxt kx[E:H^0.!@=g= 79$z8}4xs+E 6*GWSqxfQ*eZRɤ}Kώ1{¾>.9Gs9UۧxzMgV-ܘ=yLf8c.n/E>̻VR& 1A}0AbJĨ6xΨjgǛ5Ʌsͥm1a^Ȅሳ^Ղ@\Sa ,)[ GH1is:xO^3q#;!b׹yz X] 񨷹O02VJ=Me )Ϙ~:5;¨ jys a]NRb&y$ ViǬQ>(A@x*m-,X78z| li;lszc,QðKoWcVn ob$[ d i ;IIjef{Jl#,DP)  HU8ΠTlSn ;d 4pp(_SЌEk*#R"sD@Tјfi%1  Ay Py, ̰V&[&k89hLb ŹKs"npL57ᨘFT,\sl4`6\K>$ *$@F80qH sްH8i4"a`ɁLҁt^#qV8hLH \1v+6uII9 0b@dzL۷s{s M%&e6I[vBRMɼMe20.i3N`Zzx>0.|:yaF߫;]fwJS$G)-7GZ3'l2+H)D0RLݕ>\ƠBW{J`VtbL ),wH\:-T`@:n޸)>GO'-#<T%QfJ]u{|7h/_*$3Q{]}g_1yE9Ւ;6 'Vܦ3MnU'7#tVf2^x 6#^0ŝnY#(F27n}kږ]]3NjYfw >LV~0Ѣ]gs~ ꬓuU CIHUKC=o+W;?*&R~h2\:0;Qē+~IV褥(1qϷQxW%ZnǺBXU^u:[woݛ7߽~X 20O&'xoٿi"s5 ͚4e9{==e$gk~] DuwT! q0/\T]1HXbvH]J&Ί Xp^EB'ϴ@NKnF2("atdԛƈ10.E)>asZ:)vrg<4(MS(-Bл0Ii<<adecgƓyGyɚv9](Nh az&(CcRMwjNglz#F?fcBuvF'=<],%ɵJ7\T@cF22@cĜE>-j3]eVPrΘ^۰T']_jyZ~Z|6Zcx.-&PBVkmRK}Ws}$@q99Ơex3Ъ_V9A[W)Wu ~r?O9fBXѭ{ {!»ɠ̗^PpeKv7Qix*Q` e /HI,B=ʓ{?#* mh0  T!AcKX!Z:EEBJI!fH >cS/K. 
dQؤK3]~uZz9]>0%n_ؠBGaMN|`\r.`9VnEmI ZrV8'Mi~!~ 'u24y=)|~Fw㴴Ki^RI$Sw SD;7 ic&cBTs6< ݀&paL3 K"XOxĔ2grNSBm (UmMiyZ,} #i%`nOS"wng̯iVgf2yHe|u ZEd0̧`(3sȹ R!o%F0 hР+L̨3hȋʥ.KIw@*Ut(w%l8 a?u .V2m):5vEcYeIi7a\|HG`RGTPJmd2*}1#ƨ"UWiŇfWoƬְ]_c;0wb BZE̝8qب0]0,XsjX(@,d k,)PV8dYR);RT=[O>:5%1FzυgBX8Bya*oou-tN^#vikNjp`)?}͋`}^pD~'S*_𡪇L1ԟqG'q`P EpS Sh5N*iByJ * q(N 3#=hƼ PPGCǩL7ΫEĖPƽu1 3Bs4fP}y&U4hwXÿm7^3앮E]i U䧪\c3*LbdE$( Lm`yT'"$_F񈰳bSbq i$q)@QD#<2S$eZ4f$"﵌FMFSg#gπi%S=F.Xݽ3v|N!\c;=<|h9Ft`%O7FN՞V}x= ]݆i/4vN#"a怎I_,e,p`?p3/Yv_֋e[-2mNd"/B%]B1$]-þg>NܐQRl!g+#8ч,|V{&ٌ֩rNPLY|ynw+pnA}nSG!̛ ke6l*iioN] ;T (XэǿȦ9 ǔ~,@W|UϠ}| hx}NŚq ҡsRXE*y[cImaQ/J}K`7 Y g.m矏 &mF/T|Nh!_KOdz[Z@kÆ᭵>[B%JY& IXsE8`[ .`hGEdkռF~aeMPlwQ(@eHTCnbq=p /3Fq:,ww%ٗ93/UMҖ0Tf1$dQDJVH)`. GL/.? 5rH%#QO 0@"A6-"F* 5{b D?z&NZ]98^sz'Vll bC:UtQFHb&hO 0AVMh1`z(j.v^jī/ݝXdͽq7)Eڥh4-1hfV{+ ![ g?}雽=A OO,*4>sy=a{[[Vjrln#}3|˝n52X'>L#5ǘ=fC$t_-qTN.}\7/wм]ٽ?[ӻ^2tG?no̻_'ۭ.7MlzFS;R=QUi%-!]N/i/}cVzh"-o{[s3YͽunBȚ8|`^ΰBg0v/&IӜ6:/ˏ|F&U\V:M=%)2Q56tJBwʑat3.SM2OPd#.E^otDt=a…<.`)2U'iQI,x@;Pݫ,*> viZN`itYUq1og+g9x}q3y>}ɡg济ۃeDK'<ɚ:AP1M·?hA"wx.{WҨ\رYs,)e]'LMbA%NӫnFW~4Ӎos{Ny˳h W:[D{ߍ'U/S`sWu?F:0ηNNaLN0u$BpU= b7ktjz~fI'JHQ#.s2ΒQ UJE&& ۼsZKJc-L |$^Sdgt:g'7~Z}07 y}9.,@, z:^Sv릴6ڏ^]6C˯s-͑WOIꢸhh1bS6IWyM:,^|Ʊ+so=f<ّ5i'3D@̴S f5JARQ( vD (dE&uBC!9ؔT1$rhm$Y3qvޮ8$32EQk23N鲼H@oR@r A[E*T:C ct 3&_8 +kdˈ ZY(: !8~c+|zb5j5ZdI#OZaoE61^`Ү8\ E^7)RTZüj7:&㠶^[{L˭|#]Xi@eM.$tOdC `E J$B]DeR}66e#BV#>)2‹z<&m4Ie. KY|!p4vcC:iȾ[i 'Ç?v`J$j2!aL6ydc?pt;8Q*>(vF:*R琅!94ISQN*^|((Er,}ى–`Abjd(\Śko7R"3V EIG.F읔,yYYƧٱʛRվa4K|C˘% XE@:9@DdbX+mLDɻFHAIdǂz@!X,`1r,f<@7;XXkP(<`r&P8kأ*Qd H>i9SZZy922!] U͵7=(a.׸TL_WtL%/l qCo8C_%<3^(0礕y`4Olv"t"6iԑaЊNAu͞B1ѹѹ=e.s%)!(j:/036ii23ژZ^!+m Ƴ}cPooim#YaJr"Rr9bدmrt*"9fH'GlbKL'^. !="]^ykK9i X+cb)@AJyY׿ Ptq=@T2z35AB\K AI)PٛY% D^`kg1 J4RU<﷯ċsH %BְU'9_7nTy=?uu5Ol%z(BXS+DIEQb != dmUlc l2-*(+\|}SkgalD=قoG~2 kQ57NxubIw~VYEiAtQ!sE]mwхxqblOVl{g7ޙ>&]?192 @ P48%( $ʑu (1NUoI>>hmQX6Ӝic}>byy VK\) aVN  -~ٞ#nހ1Gg45ݧ#-󖑻wUs|6;_H?ԕg iEUw:fa4K - Gv:ӑ(;EIJ3 K1d1RC ic%HMZ n lLHeDWEK[t$%#bbCGPZflCw7<rm9v^{~߯q@Y$w1_b\׫|el):2 1O G춅N3If,8E7nùk7Jt+:BZY/ɯMir+a8y9ߦ+]z{*ڟ~87pn\<~Cr^0y97nw}d6JNS1m7JZq8A ]@B앵WQ$MUP(5/0͞䕫kӥefo$xm#y! cqiyWE? `C1CIB'5 5`{] . XLM[Yk[,Ln|RPr DR~]6Z0T+o/pᷟ?O?w?|8ߟ|e=WG*P_OP^ݻfTߢka9u;; ;nןB42р[+;6F6x{ P[y'a$NEIQ¢dI"Қ<؄ܥDh=<$} ^e_]0zꄷ kkJKip3]aAqH<#Gտ)^簮/&Uy6XmZՇL莒ފ"8&HC}9Gt8px SO@LHPh) <1T <0b*{Y._L GIQ7ƟƓ/Nu+aC 0uIP)!ϣ\=CH3,,-7i[ Q=}%6U64Ž*?K`VQ jGK ܒ8i'$17&Z@:=tJ 5ҀtO^G55e sTϹIڤQ(3\oQ4Nn18o3U,J&'co.Ύ}@Nڥ{w~=wOqSGhKwCFhu[y㚜o;I؜K%(*pIzfIjoZZ;G rgg" e_tJR9=R8e=k 8l :%*IQ)J؄ UqbXXle< ea,= o4c_ffQeyX9ͮg~>uO~Ӡh8qN^/~ )J~@d!2% 8CD'aM-{<NC6l`;8˰%ZP $MhmcWYJ&bb j[ڲGnL!"yNM4 lV6{e6C1 wZP@ != &D%CK$1BB=QY[k~qCCAb㱈 #G-2 u4r.0 uRc &uT$D}QD40@8 <(`) k2JМ-i[EˆXLn=U;YnCu[%ESu=.n%x{|Jq\Ll-2"砃*1i$&%qq/x(xXlu<W#>Ja+x˒(zYZj[$o\*eZp&vǎF=㒿!HMn$xӋQ dJ, 8 Q*5Cd>Ud)zs 5den|˒ _ForޢSҞR#4-nkn:tx;+}B[FeLW+ ZVJV%8JD'Sf^*'Qw)TD\1(Xq j,ё1ZrBtlL}auJ}qxr_LW]vk[7Cg>Tr~ִiϼv}c}|QEdN .PLBZk1v>Lil.$evA Ri+l`u4[mMrh%sJ1&<*yR:h:wZt27^k`XoS4! vEoYe)6t ËyBʎp^ⴂXtW H,+5W9XE,s[Or_-~!d&DdRΨ`!RRKR/ܩ$24: BPƹIB$2A3⩣FksNLIExbFKpECZƁW3\"f8g.} ,?{BGKܤVԠX5WhMjjk$W<$W#܇J&-8%D =@\s3S\9ܮ*"b1074M:" \5B4d TJV;iz㩎F&DFbE>+7MYWЂ>'o*<=xM`U՗᫔%ԞkOgsU"(iʃUXLS,ijOS 8ە/~/O!΅(g ?#zRY?e=#dn53Ē@Mї/]>a,Wji܎ ߹-F!ߛL˚7'D#X4: m&fsn3k4SšG>6}-aJ]}ϗ@ @Hn!PEgp.󢆐1ɘ29m{D ".z`x`x#yd@4UQx¨$cC ){9T+A qTL$1&(řəsT9* Y@/QL\p=GnW^Jfi^Zb?[l;0UxWN$ $8m "H`h$(Չ 0O!LgN! 
֧U$D;bc.qk!'>!1Rr&'܁Sy2]~青/ A@ h8_9^< )O 4pÜr$\LuQp獰`""vp]_l/جEǞݕ  mSAe7t\d!Tnڸ7-aACFm#M \ l\2=?^{zdD:go'ar̙<] X5M4L-H rtW{zu#qy fw!丣AL'NV?Zi֥:7T||㺛xuc.~Nu#KG_pW;?VK"eW|SzP*7{/toZYߛ&LGSCpG׆Fm93vߑ~|g1Qlh~Պ12JDZ{Lq2Vd|.`b݀6f!7FJ1i_ 1g2]@cN#P:NO 6%5 aZdXKi\[q˓~fy/ t)#Zy嫴4m{LJbKz>V_n(eG̘gJd:|,(BNf]`" ^ɕcm=۷8SG:HÓ.`#,ҎN)"%eQ@YgW3ϲ#@h82r+KYDa8:Ffh <;gՆ#|f B-K;˒6,FΪ)x/s,s$f2@A C0IH z7z9A`II%0f5Ą2S怰d=Q=ԪEo'vҒy/S$gCɢWq ?dVÌ& S 00UizQEp^5hdy 0EM #$`G&x"8gDhM9zn]O)$ӣaZ$/̷s{$Q3$D̜&9Vj($Q8ˈ٣{,ز1pG5 ٗ;Le0ߢT9G]^4me,K;PZ QN } zPM8@3``b&rJeD4NX'[m2>!%\  ^Y(1萌99dm|Zk8;f^JcrWvop|p}isZ+lb2X6NUXFnRK AU%LC&xӦDxAd59f3wgzK9dn阻^wwƃ0}яףf%`<{n?тti7^5''a{ɐ\/p1"GpXQ*ל젉)^|D0)e"4/-z^e۩,8kSP#f}y NܛTrbEm0~ynUoo an<_լ*zR㢎`!nt0#Vd4޷~q0KWs=_8$vzV3 D'ˡ^N7$'cM*];QU7endd *& %B"bX,-ˬHt|MIs*A;kdO]c.cį( ,5(Dk/NWm7yME.S'% IH4Ic"p|fT?L9 ףV׍ֻnwxg`^o&z$֐Z#!Y2yF -M4 R$`ur*pP`C'X{lylKy}s~+.8H8(oEO8\Q/?FHB7o"R~7}?:0MFq5G]s=Xq\񷋋I\K46۷AѶ8[HeB?Һ)?mn&_~ rC2m73a7t*`bEy~d% .т\v'ƀ|T/dtF.u (DP\:냨{ۭ@2qR"̾5 *RYd_ Reln|Sz:Qc)"!df!@(ɈQCi%gl:y0޽ٴ):6yJߴ0xqs1o6޾y!Fh}VuqsyG Zh夂 JA5P!8 9r*y(=gR (%yD%Xd,02P%Y!jT, 90>")B^ 0<%}]ōhmMYmJ?II%Q.g01c<@k#xK(^fA@|ߛU%꺸rl07T*>}4,nmח]#ȖWdBcк2I1*ϿMoM4=]Y~fs7![ɛ$! uaMVX.鏇Ŕz>d^ʭܜs#@(u<otޥjuN'|'?~djB{p,ZnlJ\1I-gMA05FYp{/V!q)mׇ!q}H\ׇ>֠7OWW¦uI#/FJFy X k  MA4,A!!>W }@8lp)e'”VR2RR:I Q$^ mTNDPd9eƹ,2k)U,t1  \ۉZm8;Qץ1yOn9oDtSZd|q3wn^nV=y!裄:i" jݛ蛉5bhuȡCLVg!=b#  3`A@s&*)Y"cx`:Zz'e !U.{'Te#ܢә*yR^z`4Pps ϯ>^.|r<ޞ>㇢J7=7n]+}?]3ۭ^%kW7(|!tS{z0h`^cS S GԊrR6'Ef)99}ށw*hDNՆ}ʌ@Rp>jeXQrm_Q0ˊ-gXGic7"W[p0q4 Qnu 2q 吔N ɖK$.TI<9 W .)iD˳O1Y&iddٜ1e;g7wy(6ϣkl3LO`,lу[/AA45K>h3\{Bwn/8_r_'̞5XIݚW dp1v9_bDmm;ߔ7lu~iXplh t!;<ߡ%tƟWȕ uoE֎]0.:ȭCk-[\bluCͬi9G:]y8l;\dSn[{=6>03Wnx힡9B$w+x+g_}yYOs6-E&gm..gpͥCqazO yg=WvYT&2;Cɶh˶ :  5yz۷wSGw<|ń|O}ᩎ5gD罰6KICp#B0Xc:qQe.MrrLu'X,2)C.mG<Ѫ:173&\TLpuLi]Bxs©>kHBs2C/^@0ci@!"&fAE+*8pc$9P[n pND3 )`vʖL d 2b$<& 곘 -+D'@`|Z⋗AHDЊ8}as=a+ D:B2xP&e%8L`H#H%y@!:C6d^\_o=w0]h(PT۲}R !^x~`wŸDJI+K-$Rb!'AE-utRLS!پƉvą{\nuC;!ubhD5$8d0fFUsȇUAl`3sԚ@h^)!Ց%XyFP .J؛bcՙm}THAl>4ArL)* ]ԂK XLAh7q͇ ^yڭ/GU$_KoAZ?[svu5Y^xP˘_>PW m3S[I.h=P,[h(a AԔU"φSvazߖwշ,x({`pA'hv,L;9U5KDԦldRIxvW;#Uc DEYk}k2U0!De!gS=:$mg;&Άv6dfq;3[pQԖQ&rd1LHTU15h\2IaK) !;@O[O̵ ,[R6e\y*ZV0: juZxIOZ~%11ZǨ ?.Ԡq$&'NZl.Lثn:ў,howA&tVi#kP+7!#{-qtDшCM5TQEDD Z.<_/-l`*b%^W#|)*6ΦPop6X#Neu\6e-d0%RbdE RQE2&r)*ʳRQɷT ,"%ZZ-'i\XK %|p.@sk*R;SHzl41kcTmJK򹘬(2@*!Z*7>%ΆݥWt& R)GV *#F1U F2B J{1('Dc BvNeH6RK-AHX1J{k7qk קy2}2!NSP&{jH&'_ +TG52A6 #d,0 έzy2*-Kr mi-7= 5AmfJK䬯ZA(0D8?3d~0~=sh Obv"M?;H StN[tI,! x,D%):#D=yܖ-)蚍)X PP-VaӾjY:i6:HD[(6~Sz|z&~O޴E)X,*q @ ;!UDSCmul쉵r_ĮoT'/ln>_lhQ~n^8 %m5Ut&ZʜMR5eYf=j5Y>#:g-^emmu6=&\mfb4!D Oq]T$y@E÷\|tDbzfJO9wz\i,7_7hCH9;dg9B,CWX0pk_}nC-ГlXF\gC1B\VG h5*Ng&ӞV iƶ:B?•D]^5d&2{sq >?a~=ϗ7 }9>_q]3FKUbpJc ZCV P<瘫V}d-64G(M €] bhn0V"5sg8O{tvZnQ}ݴc_6L^{Bi آTYh@\beA+eЙv5TWrB0dA έI5U#1GQRE,nMH'1Gz##NQօSֹDSC+p:ȕ%RwZ)N1{g^жgj3A9 6.X er~\٠ˣN¸)MK_/N~6tBqbMd*Z[Z%~rkc|ʓ_܉_ܛ&cKx;㣇?\lVGaʹO=r9{ 'BޤQ $Mُh(Ar:?Hn|,7i_O;:RA451y3dԤn*tTǩ6ous5)e=y;o:A5G㥡PMc1B*Y>eem!>9;>{S EDŽQܶr880$y`ܜOpsps=KLxQNd繍Vl>rtivu: siN=\3ޮ~_{ݾk#7߀ug|V0uK]"?YIYL=.SHʸ -5d|FE6zJmP"Ż0LXzÞW=נӥDa^€vf}^JZӔb(k74lL ]v#LIMܣy)IkFb9 f_jm_(c9ȫ`)o-;aӄٗ0LÃ5۶W hw U>|n& {mFPjGbt>G Z&1O1pϵǛ|ȧM"q8臍u1 9&z[Ϙ=YIV %+D04*G7?>͓ +)UL˂4[V c+˖#w2zdH4 W&BDQ):Wm +8 䝱l`RS 0~3AiCK?W9h3h=KWG& ]V܃W`%?E'Qa~NS>g~ayeT2*q&<*u>s4 haeVakh>t|a~)naqm;br[5Ǥ2'nH;Ʈk!ĕZnQuJO/Z3*\f4;/c3_it%$w˿-}O?\~ihVsɛb6bەZů_Ԩ7.|}ݼcnx_.?-^68{@0R?N᧛;k޵u$ٿB|ؽVlL:F?%)d }d^&)w;4o_TuU0 Xhnfͧͅ2~dDEoK/o骯 ',. 
}k3f1}7Ӈ6W٪i$hu Z@=:M0ĥo Y+<7Jk?tsySWK_n< .Ew -8$ tE<ds4\:Dek+V UJp<l߾?ϯ|?|g8 eWzܹ Sڡ_5_o~ۼi"s5 uMS6hn'|v9v] |-B `@wt y4}cԑcoғ^T)$LA<"( iNi܀dPDȨ7ahѓmpKUѡKCq_-:)vrg<4(MS(-Bú' vay1x-.r^gf7mMl {'ky}[m{s1K^'NNf}LmVn# #C)YIPOkWkwh|`SB]Bj%8h '{:h~3z ޼uH&РNeI^ ށHL]y;LAӮ(_j:Rj+G/mw1ӛe]F܄)` $|.we*1e@Ģtw2Y8?[1hѽM%vQJ `(7ƣP *B%X wge{3H*7 `,TJP *B%X `< u,TJP *B%X `Im^ `YJP *B%X `,TJP *B%X `,TJP *B%X `q9 p ` J(TJP *B%X Kz\WiZʍVehj\)ܵA'N6_q&C:>"t" cpO^usDqLH^yk:lJPuJ !Dз-zgIDk1M5ͭVlRdQvAgN'ns>:")pؓS,BGe^0!(=FS&yiP$_&,`Dg' qM" T)JM;7j8KäNqLQ&+f&DׯkrVW z*gkjJx--y~*Y})D/lgQ$0v )Zs=S(֢dJH1rK%bdF/0t cL![2fzɘdVƶPd½yf7 -R6wrO]Z|].SZ R$+l!Ib#A>]F'}+,bT`ge1h_xZ=rug@z!Df(ͬuռŴU),[CJ6Fd=:ҔQ7t ib^FTHa 74V"~7];|)wGop>S+>T܆sH~[N8Sԇ?[wzN?~_4u䟵;K$fv3MykFqx~ DPiƼaL=yǯ'aJ G6khe.`Ϋff'߷bs-an|KncMh 4]ըӰSVgJ'~}/-z*.R~r NsF;0%(MH@F3C !#xHHe\C FN q#e,ÞZt)X;g#H VpPy,RǞ "k/%%6'%bi1g;JaظAٺH~soKv H]l` V(ޞߣdlX1mxFUG4}' #Ge`Cl[$/E [\g/A"Y˒n$7OEѴ4<޵-OdHًޮя(nhzЏPkÇ- vl3eDw/~m^T>;9ΟƶLqIlZ!v2 iԷ~pkJ+˳6w^旇r[N[sv1ΉlFWhV90iPA Vs?_3(X|?O4l"{#yX .&)QН:1RDʑmi0iJB_kݣ@VYw<] L}9]H=<6wDWvIy']x v#h~4?8@2@[r0lgc1c9hЧ-O-x0tYqc9q]8[XznóM4x%C+X|{B]Q| *nC'tnjVس!|yC#D!+P:o{升OJ9Jx󜛶F3~EmdtBgIpvSl%˜53gyc> pN Lgv78=SJȖ{ G#(LI%ٙbcՙ!qTHAd>k53ĘRј`F-dV~͍F нd'r\Xz#8wbSG2WeiC]; KIr@(.&fH9 &]}n& 8Og1k„5PLaSVN:r-{wv4 >QlŷNUigJfɚڔM2 S*Tg*g*A0`|TZ5V6(S BTs6աUX9epjibƨ"ηiH(a6s18ٸ(4TBԵDUt,r.Ԧl/m)Er1Bo|%(⠒P(W!zJ;1(;k8uZ}LJi.!4*}ªuGX r(؍_G#pw4eEv)(kH&#W .x'ըe #dAXBXѲOX ~+CP0k%d(mi%7> A ťjZN\/p~1Ⱦ8Cok>C!0L9gs)h'2[z"䳑s&ц]F: ]/Jʇܗys;UD砤k6Jc5 u1gP|ZaֆNy|2R1=(vS_uw-J?ܴS6pMA758 >6WFstE*p4s ._5{kugr<ѝŦczw*K|/u/;}C'l/J4c\ Rg).R]DaћߟGقN O6`vRt[N.*=yFFZ~kF:ok]dc5։ʦQEVt)OW% F@_K:60 م&t7?us:>2ʃ&ɥb dr۲Ƚ~8=t47ՈV/Q/~hnɅŝ;ZtjK46۷8k3!-Z'CZ76h/F[\[N*e.keѕz Z\w GE=ȝ ifo32qmdRjQ c[ INȀ̸_o[щӖUOw:bQEѥr.?\揟|5C˧T#dA5E BԒI)Tl)ްR%Cqw^^l| _2^lyB7Kk7\w~BKy|:[uEP8VXSEy,ϖtg,Y㕳~#;XT !j 5 C )A/I8>}4/ښ8>;}*>V/=)-sr†E@m$ɉV$rpGqBc7Gu/@g#fCw-uLm黺iowml0V|6-mfM=#/(IW?']UYcƧ'3oEzy#7g\߷?QOK"UqTR|)rS&H(kO$bz9m r.;c\W6OF3fLLvk%֛𽳜"/E\Ό놹כɏo'Lf_y{rtF-N˫LL_MhREMcvG'OgW%%dfm߯m}+[ފƥyɯbdƢ 齅B 9*H`J5&Ш2)tgU!xbצvjĥr|%tN7?ldgn~0zk aVB: ~o=cnε=*X*l6EXqSR cU!q-hD Q$T޽072 8v㚷3O3/˴l#-DYՑ͡{@MLbLaePb)ZߎD^gG>֥&Z7=i~k,ws[v/ĸ=5hɜo r-ySIgDXSmP)S'I9E..fevz>IFUb ZPSPT6 R1ik.֠iZl`0ڼHAǤ60pA ud.KJ͈ZFQbjTEo]EC9}xuR PZ[Lu,$kIcЎ%V(2)Tx ^lc5kfeem eSkI謶fkNDʇ|$C(yς:\KR:j WLNAklp2dtJUJ4f!bE{:6hK֊4c%XSg*)eϢ1 ٤<(`5Hna$֋5,ЉXE,1 .EEf R%2(ňaLb-ٻ6,+V?S/eb ˤ Je~~N$W[LRmɾU37"rn=Nd@]4+xP4WL%HNg3* D1p?cY ˥tr\D/@B]^*<+f@57]?3.@HSx\ C>@BeA ‘t,8\""oQX8<@Aaa#fT0vಝͩ-&lJ8dRP 3d lXXZ"2I_~N%+%*45c@qp0se V:Y8o-Ex@_][=U Rٕ27躴%1jbᗆJ{qd;. -Uztᘌ;`ɗUXzaBzPr 1!DL W$X+Bi ,h\?{  Edu״$[`b+Z`8FL-FT`OmFŬ׺`XC[kpjhq=(//Fzx6 _𙻶 FxS9njh|(Z9I%\ EeBw )0HH&P#' \ ; F7nkɰn$ >^.WȊ"D W}\!O.DN];hE^gj CG=kL;b5(x .1#dnІEHeCT*{Gr领J3 x2YE9nfYעZPWJchy"[6BǍ3C`3>[SՂEyU|+|ݑ!5,x. ~y)4wGZb匓%~hO/:g_O?gwNe<$TY&9!x|YͪPOϿ痓l崗p;](a;*mkLtRJὢ{n _Vb=?5>cf=PqS;cwθʩY7x@X&;íqE1%N&e+'ϲY~.~e϶s$S%0ĤsJ)dU"9X!YT^ }TBΞORB6">IiXu:.\EGGmTpu-ک8u쓬8:M"'_]MrB`tt֝M !m\ 忽񧗫eW<~]>g?~c=g>r[؍#>\܉6*ӗ\k Z* 7hJ>W]POB#P+b٤UG.7٪(=+6[Y*o;+nUE* { ܽ+<@¶.f6mGG0<^6 ?J{_=7o8 :\Fm6v\\#-~u3M%?]cI*s1uznr3c^Nyڈr;E>&O6܁ۮ]MCUv܊MV:kq5B \(m+&QiuvkM_5f_Xm^Vj/c|gowCj~Ү&-5b鮀;lI ~|b7 G]0R?ۻwCps&^L7{xy~ne{lbqp~exh]t nϭ=m $-T)`cJHi=s.C>#܇<j˖AenN!5MUn4}l! yGMWjLp ҳ,B+E'&ag-p3$=>X9q9/11a4qv #y9-pf@ݦz߁'7Ӭ}X"V/8Щǁ>j+w@y\ەx*.H?8οxV_IY]Sҋh#+@Qic2&m#+AՏfۅu2cZ c ]Bn` /ěntyz_r7 r㧻e0sZ =)bK_¨C\/+zfQhҶQy~sܛ`xa6̽,?oh(apY'L;R0^}bHt={34xg1\ E^Um1/ڼ=y}4n)aj74YN,Kľ1hs&P┡n%!]ϯ~O|=j'eeC #|hFri{9Mۼ_ v[^hA ia oUfn1Y0 דsζ!PK5* wkl`:7G##d)-B]JM7nXRجr8MT骲rjM`%Z(K֋'V㾾ڜ>Pp5?S18 Q@_܋~!s/N$[΍ Fs8ߴSTSɸ. 
T/Ij?;WT)s!{Z\y?d)0U|9Sөf@hδ3ME(RҘ` kFqhh;+OE4nBGn|ɼ7^pmgYeZ!Zzt:Ĕr|坟bh^W?|uw5>^oMCǬy^d' $PI9fhLvv Jd T,*9kWD5bֳGکB ӎpl0xQ!pZϤIITsՕb1<{9&ΑvOl{@Q[1hXJ:Hu ޵#8,̵MapwX.p`w \dbKc߯zX%[;tr7fWXHsD6!fW:0@B1Z: _L{/,VVsɖO&Ao!d9d@tNEVNQd}JWOxJy{/Rc)$ ֹZ_8\ E)5lV&^&ZM&ssiou'A^; m0j5]<'O  ɕIZEfж<"ܷ)5_`/yٞoCdȶļR#?P|FC̑T&벰d* *@O=V`MOIc{t^ϪaMx~{!nvMVQZ.v HM^(ؤ?"dw==uPO*:n+oS1dBQz E{8(w(TfJNDrAGTS{CX3>15oD"Lk >$>fL>ؘ]%u왹y5~hnc1K(`-Hɳ:@D60 VH#wB g2IklBH%;h/) 9}2XYAz*HZlPCR&Z*%-u#@M [gT%,rl˓| eR[>bfo^i{̃2y/Rv 2Bzk:`XNotJ`zpt2o[%^;ٛz_g 9hzw]`:c];2x5߮Ċ<;'ȳa60xk@9;5x6W}^;Ⱦ|9ɒ1PźCMp) MZL&=LJ1y*~7 uoAH$~Ok2]g* $ "%HY[#&H",h䠀(A&_,+W̷v xBz$?t-_J阇xf˼-þe1;cjh mLXr-vSsWLI,^ڰ'Xo')M}F oշS$A,_6zHFD^$t>@B,işslϐcos0QL ;جpnX/3jV#şs7Gjj(#ߨpFU{-B5yY{zwx}ߖ~{j.h7ܮg=X|>~_ϧ=U[|a{ہg6z!|1k.Z%iހ҃hʹ煠z>ݤhv5tܽ4*=TLICB)xRARxK)ǥD}8{a/(/<(ρ)*dLS),JFaK`5s$ UȒQ7$k` LFmrL[hQ3;Cż|H..Hhyn-?`::lj}%ƞ}ƞ~=ENf[1^}!pջ#Qӣ_>V>ѵ^gjcxֶEy?L`aT>/TDbHh!TSҌ ZO4Ơc0!5<{ D9j 1'|BRspEBKL5l0|o1 !Sd格 1Yc"a:tQ8LJxQz~(2w[U[pWOBT&Gq4N\wy\6=f4n4+nt{j>M+N8ëz`d*jx^3oT!|$e7F^tpT@(RܪuT0꼸<1ڸ ߾sY]nY\nr[ն Y3)F&]3{G,)B|<0Մ{Dp9?;j62DO锟\77~'X]hX5*;à@0LoPi7&4 >,m"]]1^٩OTެWIϓ[@!.ѩ)Kpg=5(u =e!%]n (Fu_YFL]_uRSŽysQ;2ܹc[S^F㪗n:뫳t{r;KfL.>-"<{#J$)G%a%'/W#0ALF!rq d4ebݚC ޵҇T|T rȑ@ T# =iR:@֛I(X9wykYؖ h &)Er1  b 2̃ tX|^j]ǨI舴,">y⑥eHV`EQK3x2}t9ulMKnA[9FZ@.M=-et[tt5]jI)fSI;BPt3#̋'SBv XL5fl s Q @C?j6pSpLNwݒlt>)INbH֒@PePdt)`Ib5cyrLTLH%dvcoI9Kʱ-lqR.>hw$dE jV֞ssǔNi\{4]Y3BmJK6I !cx =*bm^H /"߹zO#҉ћ5XщPD5TM(ݠH*ңvjSTꔕ! 8*5> 2AbLzvFimm]ۣ:AQN4^}tL>S*4CYx"w* 0[Gu6EL̻=d8w g[CGBG,-z+5VI!JLAI mV]._Ԏ7qH`nòaaZ:EUyfxcT:/:>/&zz>n,F%nuqF]v(:_BQ?}25F_rg2;s4 MZ{דOߍӛlamǂkIfmr ׼_dO\֙T*>ŃN<8g#o}7?}_O;^_޽wo¿s9α6]""^Hw _WZVCx¸Zq:d!9fрFzqtK|=h=P" R=bzF("YET@םPB)B? pzsvnL2KeDّyC${Űk|dRW+Od2lZjA-0WfCNhٗAXR5I"[@Mq_Y_U &Ax69Lq,\llĹPQU#jSm7ӓBYCԺ'q_Io+B,OٜA]$[8(&1m7:ْR AԈ7:'GPPJosBBIiLWi4c],Xh,(^j|R+v~t>Gg%:rAۋLQꖂb8-ۡM&GxVm|j 4Tg'%tJ b0ibN8* (ah]cn&݈&tnۂ8`o=:$IDۮ( ($@ab0qzfٷ+z KƩŠ18#=df؋:[ƚEDcD&>rGv16n%,ŦHNaq6%:YPLH!:ȘQZv'cdOZJ51"6gVu`\vsڦXg3-Y1.\M8TGb)Q`C.J-*W?"1MZbM.niǚxx1pVECFTZlp'YEBܐHُh>1/R a2>=7Hu,: u]0NztS3:~-qPZOut2+E.'A7ed#kLA!1TX0Σ,3KyDf^GM>'[,1X3DAs.ʢKL3#tw AG+U\_j)f4 uZ\'|$鬝oųPwi|0w ߀nb1l p.J>Monn< NX_1Yۣcwx4yǭ7}<1t*EPjz҅}ɒE*B*{tI S|/>\Tkm+Mqhk={1-8 dD QB*(&} 9S]j7#W-נͻKmy7VjQ7:B_bgqihM(y Ӆ|Wk߃3j>e.YL<|;k%eA^g6b#PED * Hs([i3/TFcqi8%휔Q@He!,:d _@j6"iSV>m+qV', @z&AsV2?9超B68^_| q;ص4Cg1uxЁS^w`1Lk4 zHE^.d؏sT:-m%`G3:H>k.y*b=(祵JȢQ(Zh&|#zv|4eMϮ=Y2oK]٧[W eKRYXwM~&+v,xN=.[W j3 (;8M7x=@ȆK⺀uN)={ywU`SYuߛuB7Ϯ758{c? E̲*wsp 64:F%A.FňLhEBvwW^S@SP2dznF'I"OŬu'|&`Yr>ļ夣O@$0$S9/% NZ(>I}^]Im:5]]meB})km(7;k^k^A _귐^~?J?nRΜk_]|neUo]. eM&^8/o|xA6 ^wA+ߡ+v7/PJi%-,d( EIGCH5^_6>`$P6)-1% HRV+cu(4^R i U }l}z8w2&!-ʐX,Dg|\ڵx(T<` 3Ճ5T :8%稂(!zC|i0&;/;- :IR2Fŭ]I=2 {8W-r'2i_in]6[V*b416Mo nnm?ś-찾JյMn-鶡o-SY96fԙ,2lbNcVvbڐl&G& 9Lى T9iS>IAI5\6!ǫ-3K7΅y"F32VgE`ۂKc̲TɥVKR-JYf4c]~ .Ǘ;+iw%|Ͼ>wvZinX|ɂxiA'aM@ht>H )5v+"3%' 8 Zx ̂Vɘon[ݹc2D9JF "cA;< LdM4C8eN# J:~liUO?4;2)Tu.NhvAM`x6?wl8T+Τ2?ZuUV>Y3~޵{}?sdo;yسa,ׂkUUr^/z#_pPSFxuk?ktsGj VP՟^f.qHai_,;ؔI6=UUDŽT*6E'a72o2b`ɐz 2v_1DP=3~~P)/_zaw@(wX_,0l%yN8{AR~_疌[$cd1}zvn[(\D>(zfY nuǩߗO_HzG-kHAP  aɆ45"D$^寖ᷠ{vmP /[AiZT!Uo?Kq4[5[|HM]OEMi+M?v.9(z|W>9f@bS=+IW--./"ZkNN\՘1}f ͘#rL!E'"+ˬeJxKxAIrBS,*nd:WpO% Ը[o uB2fXKKmӌ[7g!L\WoPSȐ}9R@)r02̣QAڠ2Nxǭq6RQƗ$,j#?kȼ26Dr}` S4LX,&UnUԭtlt S*FH ,86ZF8Qef;Y \a9JNvHF7۲k0o]m #--!?[ZKR{ %kR`C͢ME0Jv ao7C9VC`K2) Bd 9L @gM|tBO^|JLH4tTKwF{5pྍ`krGNWm!+Rp&"KA @9K-ɉf$OK9$E`rھ9j7Ea4 Csm< &CTt9zRR 0E #tkq1JȰwNje+ƒ"jT,"W^2y9|`䈛KRpRetv0ufU$)AjRWRGg\2PG˸:2;7m5C$[9I3MwX\]{l Fp%r-mٗJ:#64W4%>G0Օ']]M//5 S:=9|r2;vR(N {'ӑrd$IٔCkh+[zmKVt%6,*I ,Xfe>6{ɦSQͭr]U3˞kHiX4r1]}^ RVT_޴7m<~9fOWj橻Xp A_Zze**BF**? 
E~=l ̄T~ϜfE,3]YVC93Pv~cƥn] :3xU@.9#*$D$FuXɽ <)sRYfL2#n]u1Nv% nk2q, ^KYp{ \>sVuԝ8к]>M[׎sjWd[mcp50k絨3x ;k%vyaw4/&En8ḀJr*%gY% y(uD1vn-wT[1+,gާY'n>v:!U#t5WLu9f6=v& W : 6UvY{H[='Yˤ@ztXۭ :8ε#[Q ܤ͆Wkݹ{2ܧKFM6GNXۜNV}8J5Dk'=:Jk:J=9^u ! ik05}Rסyƾvg•|.ͪ YQ.F>ScVa18A5xA?aʿ_UWM9(ȺW?}7> 5n%mH/_~H.Y Z ]M_-Xji6Nz"HgѨ זj‡-%%BPeS!)&{2} KJ+ݞv}& Q0N0e=XĜIG*q/|W 0I>Z#lx{5 Zݒ&]}+ꥋem!qp$S`76aI-r4RZgcP:䢰h )`+ Sdǿ^џ1a|B:9]OtR4qlcԟm\$6vRPw:2s>(L~땂e\]h m,v]Z;~G,ҽO}S֡OciїD_ܧV6Ĭr4Z8PC`2Zg$g!tl;@ZdAyF@jJEK,x`:$5xbHBApz)Eñ*yxP)Df e(Zwfj_6EJKsmvi?y-OcZnG#숾 `qe'«8s3Qt`ܣS+r-$(',v0uGj'v ǢԈ SG5HUY %GgFHC.hB0 zDz$@!\=F,%g29qw; B S[NW<aAp)|DAX6Gf|̲jxKֳDflM~G:ՒחojWUA/qxz@LBGN2z|<:|/Nerj'\9z~v<{6ȇ$X}{dwvFa!T QjDeKÇiTwUI2JaxXwCyyKohɇE0Jvpe-ݴgtSVA|3 B \&zゎ+,*=g9ׄI'x=$y6 ) $wyODG> #1:Gф 9ؓEds݅a[#Ak3K/ [)X%(y T ArtJ᎝7_OL¬wMja8dc1d .R|#ٓԇ 4yg_GI-LpSi}TFs{JՒEGàRs xATa 4n3JVvȅIHK 3s]Tp,j}S9k?nr̥EBW +/VU ez16gWSd/Q= Wj0/#ƜwzR@Ç4mCۇP/~Qg?iqn6ɟQw_dȲN/z ^wO <eGDۗ oܵA[t1&du]@6)K}[0M.ƓO y6rq qv6:7K؍əaKk$sF`y BsVA|m̓" .ƼoFT^ݽJB#^s O՞Y9;i*U{yak_ ~xG:=r矦N;1 KU| S<FHYp|?f8۷?|s9RSD}~)}dIK\?}kKmNp3{ 8wAe(>XzU7_ϖ?o`*Rnt%R4Ӷ~13Yo(Yy݋ QֺTғwzZ!(ϩة!fŠ {e/z/P$k r~6d>Zv>^݁'mS$tlsnU՗.cﶋ-J]6"5;竕'umbu"/6kZIs6GvSk O~kqD-48ۮxVwoO[]cImlT<kykrk"KT MP9Wm}@/#KX5__jOY=d q$j"q1C\601g6@ H`"Ee <ޛN7޵ݿ%{{ ^V}96\~ST u ̣1(C;_iJf|(EgĀ2@(JV,{i W ~D x 3MkM(I:,$Εr CV Y0,R:AMZ A!D*`%2K 4׳:Y罔&;kkay/3PNVo WqiIh x+ԳˏIVa@Y).uGxT!. * &YDHd ; M:q8qno{tzYԁZh3T.SrHf^0k(Q`%ȵq$dz{3Gq,H@ȅ5a `XF 6Jy@7t:YW?&6"Pg., PtXq`\ e-Z0G^T>F~b>YkFPwӟW4SThQ1HQ"ȝӸr4̖+;ZZa6hIk)ȕ_B'MW&x,W0s+UOqTG{NG{(dG{Kx{['jXN8΢ q@DOd;$4:yT1C,'n/"'P\0__$+󩞙oQX{D@@DDu'XO&)X#Q 9kM9G4Ɩ ;z]C}pHY Yf}52$r"!lg l^"PlҤgߓ=gCAX B)e6੷E"RQz(\O*N)RQ,O@[Bo!QʞS&2F":߹Ds.{&-|ԖhI}ۧ}ӱRSU}\5yK} cدipb `hQocl%ҩ"\@uqdbݺDJ"ZA@m ݯ-S`n[`,0G&fծy> d}͸Qߗ;ܨ Wrn[b >EA |,ߣeaɠ['(pA07/i^(o!, /H/D B"~T{MSz]{\GG~|{|5|LjlTgEv1VL?pMXzQ] Z=//qDbڠl|Ui֢VEծ-MӢTl>Mk{/(^EBWMsYe_(>!,QoP,OFe_o^^ m)9ΪҺ\vVu>vve@h TRbK/-.}}v]nԢ J*NԖAqM-q^SOy8qf,7F-iX)8R-bVRap|QUWvi]|XӋ|{jcNJU|_C.h} LU0Ahsvc:N\28ߺ2X_ط|X[jÕۋԍ`u`vYqqJ ӢT/<WVl]?w!=(dz+>=Hޓ6.m=m~PJEIۀ2bzZh!\)Ad8\.!ѼCX{?Mar&(<LxrvEt:?-]@ܩS./+sc=;fXti%, btDD9ұwjPduBLǹ@sJ>K,h`SFEL1|Vz#09:%oc*)8R8Pkhf鵪Zv aRjmV̆^roRoUZC.U牜+H FRd$c$T~C[PݛVƴvJw5p^i Fz<@sh8%P=Don&Q[cOz/7wKX J{(F ex?Bu(~s͓񶭯h|;j9ϸ?FnC.g[ܱ &:iFtն_#iҷ%A3-FS[Uɤ<&kݻIu%2L{cw9H-)ʨ2 RO'+pVdRdL>Ԏ1g,/^R"\w @aŃ)n#`)B:OXx#&Q)ML $NrTBs32T$*\"`\otLxO_I,5mdv-zY.1Po#ޢZ=_)ȍJ#&HeI9W94ǣD?J +s=O%%^qrp:b._բTYp6yt $pTk]{o#9r*B^,> `/Y --y%yfgSl=,{r[mӃUSU_!TSj69x'z|/}CV(;/'Dܧ*1ZhXϏzgbWMTԫMg%;)S@ ukKPUܢә*yR^z`4Ppz.-.Hhk>wo=o3O %Sw=7nM+}6Xw,CÀW_jDUGO5Š吼iLJOZ7YXa:〩JL=h,ap,ZnlJL$cY#T <:7 !8T"Q.CLAr)z6'Ef)99}ށw*R`2L6{R3iRGL>J7zh،u!=3}n9_ c8'я--tF4QfHh'&)er@K|85 9ey:M`DS9T6qBѻ' KebR!a4VVW߸@'w Ifaףho(J doN,wF;B6://1 Ye(8.3l|?W]=kf9)I ?]kBAߑRs2.53| :e^o?n:t/^rfNp-ۈ]o1璮ޡ͋=m|Qkw]zh&chlDԺKfz=n_qjRˆZ6+jw^.zr?=O[n܇I {#֗Ѝ>ޙ\Oݴoݮcoj$r~,)iC>0^_Lj28T$>䖙\ުmT ޏ~\LHv()J/tHgW$4| 9d'pwzrb3 .hKRՠr&Ʉ*hk'<3{ŕ)>T??2&*]UpLGZ'{0VrJՠ#Iid?/B6r" $ B15s\\_=Ru5W[x;v(WSWm3<RE\xGnQJ!e⸭ϘH R!f׆FE93vl?]D8#i|"^CuLFSVRP bd FKѨD"stG:1 1RrIRt9sBqB}J.'֮Wm8"CFHږYΗqw@DZ3_~>JKӖzؓ1|pE6'uE}<(pEq>gEM ĉ&2h &ͱs27$bUe|CeK9,]_@.@k"r4q<'ǒOk g4=&c~h}[!::Ƞz=J؁,XRl2qŸ5LֽSZޢ,31dh,cZ9ϠM&k&lk62H$rLB 0VLCUJHe:+IT20t 'fy fa,e_1g+~%ML&B ߦ)S`9I.W$3 M$b:d59oz3_{3a s$BT7]xNZuņZt2iPKWHN#鈦I؄9ZhXh 1H_h"ʘY4MY  /a5xgWou VB݃BJJzS؟Mhq{"3\4%'j\4.B#Ӎ)7%3ϩ*6*? 
.>)Qܯk<Xc(ώjO{\>qYHvRAmp0BHlatZKsiy)3) >Wsu)7N3ӂS2M\{4͒g!]ѢVf)&u drع&_yb'ehrD3L\noj uutΠa2v")",qG&>ZGW_ꭱs?)sBsMg1^/#2.chb2+hT} ,) I`MZ"GQ2j\vח~58Fd2}Q9@ SSޝ"!V%@uE=EIlYNQ2%ӋbLBQ4?q*Er8 EI9EG@14J!HUHsoAZ4hV%fфe6:k]vOS' U?+mTG=x3ڄK=選+kKmP±y#8| >!U+WXJFjfcu +!\SXncl#Nx7Rz+z:ox 8w_-nSXncl ٬#Onέ"wo$,75/N2QLN2|&:ئ[jbV ٩%V ^e;e׺3v Z\{#uqk -Mլ -mfb T2BTb[\E%R;Cj~htUDKRv o#^"(^z$X20+#n="0+k [9&lKWA/I}ߗiP[&~ͭ":.o£GnK‹Fjm"T)YF ىءX=f_gv.~nX]]/N7Z/S+%߀KE NE\Iݹ+\pR)1z5+5)pu25F2sQ AprDS šͿ}SOQjH¶?CM.o+@+j.LJh@پxc?F#h c\HOٮ|껝ye;7sh9׶%b5VQO`mDHO9Oh[[Uim0;k;p n3vw/xf@WAvAhf,DѠ9xߩ casSCeס~\a[O`KIb!f߼'4˵࿇.WMީ77l0Ob]Cg_>> ۇ]*?NF,")(`,d> .N~z<mfN#S:5m*)6?*2*ʷ {4 N?-4^gI j:G X+zPd&SրZ|Y]6a#uv!fApvS@Ey"vB;jAUDPvHx]2v +»Fj~<6GPyf:s%Y5gʺUDPe4R{[]֒TX<#GP2BjAo:0&cS\yԎA玠,<WGSFj$ܠ\,ly6B^Tì+Ci_@9Քv1bVV:d,aȬ3rz7QX{Y?לvΗK`(٨k R>q@l YE-FjzAԴnJM7i"<T /U_/.A0+k+KcYY#u|v|T{ C& Էu+~3GPڪ B+aE#( :"( FjiAI=( =_#uAg$vZgVA>zL7sԾ+;*Bi?<.PqVJSPWz@Q߭3ڿ8&}y4Q2#wO\'!ܨzFjZێ22B@9 O qx S{DY$q4wRL/f+Pb;Rw ՅUb(B(,:Ant2Um eQʘ&,;qy *崚=}5$a1eq-F;o㑐EZ\IOm-R<tFWv3-?u^ Ȯ(Qp% PZ]"*MXg#$+ul'[ֻ9*%d㪰Cq,]cl5P٠ST7>I tfU!@%)fhiffܰlt뗯l@(0rI/HlRrbˬC*fyU&_ -Yu\:S!s+JWb /!aO~ Qq6=EE \#+iRi iRe@ ʖK}ߗԆP;suLnmO1rc\JU| 1,2A)U1>YMyAW .F3ƋXCȤWi:v2\0ɦ49cq>Ui1 `j~]5{"kd5G)K W:>G+e$^L%_yuVEWnmGhKƇ5px{k'?I':0Pg2x #gNbFdRXU7I9+Q>}a)W+$^I3jj\x?9ע%Ë3sQpę 0IJ98FnrfT>´ۙkqrH 02j9P`XYMᕳօraWXGqe9g9$oxË^ԲaYB^ʼnLԞx<2汃qhsϑfAuJf+opqnwC2.zSwCH7-+NUĢ$upTӤfUYmekKE[,%!HAIXG2UxR@2o.Mic`C7 KUd$Xj6jC팓ׅ cϫ2mg{;pݽK:0Hmykq4߮eg, `PRaT^DJA}:zFWBU a1&ˏ;%BѩkAWrYm'03}%#P-SnR6V\Dj@M9BCQV"iʉ;@iC) _z煮Tt \X"wE,0Q SֽU) H`)=LiQSnaJЎ, ,)l<_M%IP4=*l%RHZ6#FmׁK&KH9\dd6J7zW9 "zGcFjg7ߴo!<7yj#?7!in e͎t<3B<i}G a^NU;WFwF/?]y1ݭ+^"hvn;Y.H)rn%ר-Okfj=u *_ wEUqWvw]QeU3$A 熠9k.Ѱ92wyз47 ۈmZ?8عU_|A߼Kbv}#ISwo2nf>hû?5)Ro=1A(&DbhM/zo rF(J**VXa) Z;U^#pU8.0B=2(J␅4٨?X檠,ʸ]YWD re1QŒƓ ^lSޣFkžxZkOeoyOF۸=oi#m][sF+,lmmH}UXڜcWݼ kҡ$N*B4am 6===_+B+yRT8^1B^t#fxqzIt7Bn;[W d}1(D@dc;y!!'Epͅ?] Ɔ16P؜Ncki)h 1vR1VbR\)#v<;K"^o[sל9P̥%yRȨAc V .zb׈dl[D2uvZĎj%l0P)˛py/Pl QW(ՠ%z %`I0"qa6S&ҦtMe(]ʡz@J]w<0iaBQ&(Zi=Ż!}@?G8}O#^#uL;z*nK7mG2Cj1N0;ʨ.{'&+4oc!1RnbX҄HX%M4Hh]7 G} 8:6ˡsƀ2<Uܫ6yԾ=(|SX= #!qG jPda7Uu^ W@`)_P9o7g;Q(|̿3|Ӏ&.J&KuV~yr1^Pf|fF9}Nl2ww `{w]vsݜl^?o6gq3Mtl.&>LyQ6v{3p2oүmJścS| $2;? 0{\^m%4s2j*PȐQ#jZ}"pÙ#2R1pC$uj5 $tM'մQ7|#BŤM2pÙQ͙ Mx3r2`Ca?bmf@xZT%|0j^L VSX nx5%Twiw 4(߂ 9;>W]E3A\ ,X-ba`0 \y`. 
?m5nb (,3s9,N汍$&Dh"~R!ҵ5Oj킄r0IGtDtH70;y*%?zS"<"`7 ޽b!T,:O"k(H"F $cb8e SPfhd8{*+H.HFPz'T멒s% #dUUAԢ; J4e P\ ݏ516T]F^K Kg@!2P%~D1U"ĉKdS"BS9Ts!#KORa@ z8Y*'dZ #K X K۹&=AuL)zz MXoj Q5wC5Q"]2tøo(56fIMLQR;D $C %YPjjP6'r6j1a3BQ ~J- {SȼB];=y'Tk> {B=-晏HeMTcc&SMd}C9P> heT#rߐqq_Mb+ߊJmE6GF RԁoW]5LH]/zt:w)4Ї@;$0顉$v'^0UBgkRJFIA)B5 Iޟ+jp%f=ZzI>?.v32;C1\,ک&20w7Ko4/i^MxA^#1Ƅc,j,4BFl1i[F($e#1SlroY0D h]{I}_67gϦ -L[b'WR/UBts?T/_k!0䉢!ᩑaZ$HD(I6eQ,HD=6$&[K62^ vV\OB/aꍈ1vRT(PDPRĢf Z)e< 'A=c`[ąR&@Z3~ةb 8@ V":):4V]o)5,Ԗ?|݆i79|ůIMbW]LSҥ@ )h#FIO:Q%):uUqߍzӉnѲ^)cP]y2f५oHNRm4`n" *SV눢h_БI $#-Xy"-m4IsRPs <uf[)}݌g.Om V&s`\s/ŻHO?G!$ǁ}6ُ~Z^+׷'89R77Oߞr})4a ;q Qz{v}7~B"dvm+zS^DljG|>徥-M!{glpoVߧ{'yRI$#5ϻևQ>uåO.}pYN"uB2GbʤViB"3qRa*xmVm?7(i tA$TdIOA֤ƺx9*K ^'*c ҉Dq3.T&*5$aulDgZʶol~"%^ylZ_[JIUG;[6![Qb]BbcTOoFSICZв,Ǵ"(lJ>;W@_V,u6'y"[([ r/R}jل 7 \.uhS|AFVdutpLpM%=shרۈk՛ȏp6FKҟ>|A8q.~2ͯ#Ƭxk\;ˣp`7.fX߯~~1p\?;7-_᳾5mնԖ?ٱ)MGIg]D݃jOs|JeP nC[qnVMkjC6O>֒8j٧p3hF؛0*V:.9#_x_^ |M-WrXWZs׊9_]OAYgâxwEpT_oOj0V6߹Hɭ1<ԷW&gIsgosŃtY]guvsAi99s'Dӫe|Z|4 ŷf&5/ ,zTGYnWMt-$Yz9㚺 (*CsAt%̭pwbt@zB ft滛*gqqc&1"}'ɘ jJ)0Ϝrgu揭Y!PjR#y'T4c;uHK!nޮ6k!>y2kଶDOk0g3 .ͧf`yߜn,y*R@K{*sԬ<&zcQFsKKpB .MOĦ(M=Z4]C `Fx8'\9;8yBy}@6"G6gxŌ7zp{4 9!H:Ѓo G+xB5NX6E(g܍shX&9z߲8f]aRs'7۫4<Yqc >UnbCo<3e6Dwd࿠_NaV`Pvy>\wƷUl>{#|ap@6dm0DӗyNWzfzƖ8i5U*Ȫ_eEe_Khšݸăݚ Du#fw;{2ݭySݭ UN)$on6߭! cwDn v5껵!߸nh}nh-PZ{b&->#,ELN$8 at|WnlB9 %)"CKzOk,"50CT_֩V5uFGwRv.՗uh,ԊR\Mhzt: 32Qzhi]J%Պhh eS#_ExgwK̫4=.q*./Vu7|}=qzzn j5a|s_z|x&r?ZI^>d4f&O͗OsV "ShKݛ/o0YT 3{a/x T|\<2`G2NͳBa9<)TıTsEwBoyMX,% C48-HƸbiV #9Dժg3no<iԷ:9 ,fgbTjR,#K\Iq.yJ)TKz@B<ItBXȝ!cI c:AXf4&c֯/ySB0߾>*5͂޺5Դ?x9QIlE#b=1Vki_"lv<+rd 5⾨t^NQyb&o܈t}- Z5mI(ߗpTFH&Lz(:<ף6)$%}H6CT`g_uFRWZA]JrC*B)&=TlWtΰM1püjQhy^ ZÍg|'>?|hxeY:` D.'?I< ! Δt`^vm{gĢ"(4 E+(ĝ`labP(_|+ {hcyA;ǸVFd3z2 c<>| y}D9ж00k,*z;mC#pyXn@Az |̔ i7fC:[,VEӧYY_[Ͽ.*'oǦ$[ٍA?>rsss|v#Tgil/`>;OEz0ؓ__f`1-E"F:,.֢`^>{1B(ƤSw\#t#GuS4u"m xRbGtd!ʘ6wgn8Fܔ5:?x{X0u5ob&(AƓ`iDi_5..!c|0.czpOE1UNn˧{n2Z'Q`[%NE!ndULhpu.A2D]uXcrj՘iZ1(em~~~7Rm_L?LմhVzEJ|:S5N8jЇ;bqQN@C/k@|: SuxYJ36iʃ.$M(|as(QNh#÷ۻt[SNwnLmwk!߸6)yj=m Du#fw;gAt@ݭyeֆ|*Z)SdRZ0p#B%ʀ@a/|,t&:Etl7$P 7GE &Zc:M=M4hl1cX|XRdf,N S ~nR4/SVXQS>Ljq0Ue>zy\7\)HKd2fS6RL` !RYA$y8/8@؆?mjm n66rs!fJ;.2rlTF eq) Q5ݼCe'qWv+׈:,5';y IM2"WJsߗQ*n_ 9T'H{#*2F'q3lEȅ(hs6)",yrGE.&m8'66o7l)Q"7ZtR|Tk4Bv}Fus4%8@:P 5CZh,ɲ9e2L aтl w>V!#sZPT+F ڵ&‚̧jhA{$?7e[}gwnJW$_xφ~mVN-)nCЧv?Ú4n$B*Uˮ>B!S(S %"9)9V$<"is RBΆJWn0Pďk[IǏqZ`j_T_&o=~yXy察D=}( 3ūoINLN. b?|"M̽ݧD̯M L#h{}kƮvWhPC.Dm9'q(Gu{_GT2)D/0AD}8:{\"YA t&㰗OL;PVpأ$XbAlI@)Bs5x.{qLD$Q; eV×xϋxvM& 9 SFE{"a${0[toKz""`Qd!L?LZ1W\q26Xje'J<]2JT.u~'d<$n*@:F'ݮ w?gW>Q_8~ U"jKqPjq7Ds]A5' '뤖o0}WQ)z#<0!fX=OiyQf>ϿgXFy,a.SIJ,I˥ӖPd;OxW9 a I(*%Lx zxo5!Q_IȆiS"ꈔvQFɀ*D7)A FdxE fLLl@m S'@Cj Ֆk\0&2L9AьH+ \c!2q)axa+R! E#.2PK  Bǣ~uJM1\ү|ķځIg9.RqBӎ2pF$/i,ߺ`•>%7K}s,*}9g 4_Uk1[+_upN0^sNL[<亾_ 3s{eVK"3l.hB4U\GlERZ MB MZ @sq`!دYx$=tq؜PGgZb⵬ݞ莚\vNF"XӎOGM?"̩Fk7\1\#)PRηH[fzN]ʃXSQeEdYgUNc+|=Ng/_ !.⣟f>}xsA E$B?/ky_- v}Ob2<}BM8 &/&7bú,pr*03RN)0+ wf 姵k8O%T+"w+]kBa?ןAaxUP / :.#%9I?ARgoпL~xR0jxm[hRE? 
.nf}1@=>,Ϗ+o민|vጤKr,KLr(SYcV#?|>Yѿa78 E'f Z+YY7U!Lo g"hTuUy qB ;9$h')ƕ\X5`FIJ}%GshK0"ôZ!6:D];DzD(gfqc0,IsbI9V kc D 6c%pjrXĄ6H8Ba!~-E![vĘz"ꄠ\W_ΰS )"P\ e^L#̉"p.ţR7JT!tDY=,9Xx+gܢZJ9&fA},+d _SPWe4LKZ0Q}YZ12],p+;E 9-( aтϨ RĠfFB{Q#Y +kv[)Y=ELh5'w ã✰p-eJK{S'j/)PQy1zHJTr )PaNyQMXNg ˮ!BO^D5h zxhCxOS<'!Uc:3fTw^{~ ^[>ݻ&%%iW<0m@-#!d FLj)+ q` պ3 brT%߯6d,1-`2F>O?&5P`JČx֩($Sf˅рCFbRt9#3FXE 3N<A sanbs!EW 5`gxz} yyVDiS9(ҠSO_ƕ~(}<#YB/{r-f A, 6}bWD$%^俟D ejfdX/3UUץK醥?}u va!U۷U #TleǒCۣPZ Ą"N,W@jC1`jZU-Cyp?E-~i4 {wnzZֵ O?n'Z=$^~qOuoٻ$E9>ksщj%3ky,G+n:v{WI+|xM<׍K6ɖ#[kK47pz4GVSf*TZhLJg*< s)Re$P<0AxE[=1)+MsLBOnOQm(S{jPX&!u7է6_R![T}wS}zjØ}}r@Y1r>έDΧguWn(w(l7G9NVL' xG'ѐhr~+k13.fcNA*Nz~<2aL֯0=4d+t(C=\|p@9w<щ0rX4'~+>fт tL I'SO"PnmQN_F8Eƽs QYĄK"Ra8ŇŻsbwh5bK ӝqF~ϸֈXIv#=916D³sm6D[Bnj[_)4AyQBۥ{ i~z5)9M%ڧ8r Ac(aS MLR`"ue IM42.9YQ -ww*FA W ]U׷: hx#BNJBN)!96h `2FldK$7bC`Z &Oߔ 'L D+VFEQI\ OAEFzd LXxpXGL{ڢpAˠl0)z@YfE<ĸDK\rPl@\Ks %Q%M!1  %ioQ1*A`VW\ B6"-/gd w5)b|-_OXWdiFdTtSL}k43U"%]=dBׅI{~r^9~̓wng]%\ 9dGhr%Y2Pj)L`l:ha[btRm_"^0 e՜ {wǥ<+.L;p͓u7舉5;׃Ĭo+ ڮh )euwB%2㘎ӽ?rݲB2gBvsVOmV1 uG^z6*ip`g:m G bA!Gǝ`"kݤ*tl <Ks0ٲp@K?dz-x>{޲P]]{UK=RE>5ZQڂ}K̭]N#^w`Q[vU}){b>rPU9*D;ZQmy{}2-%YE܏_(^IA*[Vo,ILqFrRQpYox|S|{k~L\SďrI`Fsh3Zk \%'LIfg\wr@Nʡ#] -pl@W6.ϻO-y'9. ìO Ϛ z9wa{hy* .`!2 Q\E~[q<$P0I F&AC{J ƀU.`&Kߥ$ l0#\Kxo Mr˝ؕQfJ>׹2!'bCƭ}aW@%`K&J(EM98 C>rV9;7(,Wx3?Nw 2{bV9LP3ZDb%Lr"Sm_-,e0ϧqحYT,Ph>&\ P<ǘ^^ámأW A~ 0ˌ9L@oTMcŚŃq8{xWegJ|5314yКColݿM!<;{{]9J`h@7&3ePgOѾ &%I>v?B!0˕H'X#Z*LЖ˩U'N6Jf`Bs)'2D@&A 0. *`qdz uENj~YL<+'T0 u%mKAN]mf{ {^>>#&M Qf^n.ҝ~lDcx)D.@^!;hž8aDωmK_9=<.b!c(Quԯ13x\Dz;Y? sf}F_kBM*#hA`p@èܝxUĶ`a wNO*~M0 6&z > vg.2yCɻfQzqP/[) u1@3Ԁ=ۋe!pK7ji0ؗ  }!&,ؿ{n|~X6ޫ"rth h::@KϓAEhz3uhY,];|PJnWeclyvWvw$&=MF蕢Pؗiz2`SpΌJKf68<{N N( Aha`Xŀ(}ߡWyzo.gpaD9R\}Ǩ`Q~==U/Θϼ[裍hX s%L{lp,Quo_N,QZQ-}){օt<(i`ptTjC _ /2GZZj^J/-2DƤ/AeA*5Պ1A-Փ<6ك/ =l:Wo`c ٸɕ+,R#&x_LМc̥*$񻐃:y[Ӫ=Xc4L ǚw̐%6JLfoU e-I<)倻Xv1h|z-ǤzH9.m2O?lA-ǪA=LRa0r1ׯD^G唄"eb%/K/7ǑCqjZ avY+,]4ez_) wc/`BPډ(\\!]-nu?SZܢ5< <-GP`tTAWQrut!My)L_g}@fv^8n6<4^){0Dta"L5tfj'ʎ\S|WD^`<ʄMR0@rH1JHep6pN#0%Ϲϣ<-{yVzԣ|*8p7էPm< }OBQRK᳊6X*}P1:uLڭi/gصO r*Pf( d{6+\?RX lXo8ҁ6ܧsd~<<O>+G8/񓏗 ވ-1Ziou@m >ƛNaoG}ْ9D؍!!8.=ޏvfoXzk5kQ˕kWŝ_ :F|x=(_?v[+ӿv  }.uџ!x_CZ1-jeDW*Vl4Կ(%6 5n(2@W\2'7^ͽ𧻓i ԉPؖTغx5k&UZ]KA%}23z)ST҂ 1(/'VƕM㭌7̧oFoD4 4z[䲖iSt' imn#mdkmHnT/DEp6YcxL$Ŏo % )JТErꫮ{W'XoOӫ02TzQ‚s}[p* ҉8!yيjȱ$x92\Qߍぉ1 2_`2Hl8!:i<5&E|Y-pRr!wHjUhFE)H!8\cl#AGSnͨi}0{&mx?g2 CUSx*b{lQooF ]nfgޞ<\ZL*>T1dVɬYj|K*YAIe:8U#럿|R:QKef=]/gx'E,k ,4d2⪮f'v6Z}s%oIukfS}qj!<ӓgLJgkv'~ITLUqa/ieZnr* }J؋m~*| $I$ {}A0N-Uڂ&sQǘ8]hkgiNϐDD@ߦˌ$h4 r!%Rrm1"s jL4ƻ%—=SF !A030 @mG)}MJH5耋ɓU+vU)e[كi,xmCji~m#~QB4Zr\XkfS}qjCձ\r! DC6]H2=F:DžveJ)+{FzeJiIfJ]'tySgΠ(--(:e ۶qkg6nTk\XhAтL)jE:aAET7҂ҟ`};i5,/rhq6׃ڭ  & #NIV j,@\G=ySeډDqinkgH|`> 4P"K"tjTP]X( ~ҧ#fmqIFnn .n. iE%&,35@!1C㟟j5ş"(9? !, \0!DҔ-!T(~([ .IIWp}mJe'ywOPm<&k$GIى_#9bcRJ5O`V># rJMG߬lShBSE; -mI4ʭUT :ZsnP&yϝ,Fbb)gp"c@i6Rj tXɛzWO ŀz,UAP:a8ˈсܼ8|=kt{K9>Y@$R͂@ b3OpRyxt07*!$qvy$n@[~d3t䚫-k&`))\f؈TWT OӘ3&kcᙊ¯P-2c ]|u ǰsaZ&ʡeJхJT_ܣZQUL5¨ǩXZcq}ȩ 9.O)Rk;xݽh9iӏ_4D/em!VSKLgxj.]ڿ;1J@g&a"Ȫ'.ݟ-&[!oj^auٕМ^I QQ$NwP!kk26̈́KnT>Z*R֖rАq2cUY7,g4FA`aT0qV׊0{"0b+ _BJl@ vG J7c"'p;1AV C5j29OrY;>@2{K59=c Z_8>8Tk\~9?ĞFR48mLJe8i5d,Ve',h#M|ΝHN6OsFEF+b<^)2+O)3 eR.x< x^-pZZP*:abjx6?j'|<3YuWYLTs*͞3娼H1 uw|mAEfWY\5ZnEfYTs* W+ g\`Er-%l:??ƋNdXR8{SMGԞ|lqBCAn$S'?S{N-⮗OHD1'V3X\,ߩG(+dD"eD8(L9eM}gطa8W;ïupRi@~q޳rtGgZ;~ w/q>+t@*Fd'nܛQpo.fy;?.F.VڦT=(A2ki,'D"Ͻى3 kI{OG?Y*Bt=(=f_\"9}Ziضj¿.ݶoSW9VGػWؚh9W=z_p|`@a{0Ӳ~~};<-Ix3Z\j.}J>+ -;o4mY8>i c8QFâȒ=/JS,謗Ŭ10_2)j0m^>7k-#ahf.~q6vVϓoOβtɗ'Iݷ)+K 9s%OZ##-kõϗ{T:񇗏udR+ g^STŲ4gWRR9Lw>k\v>GEax?a6u? ˋ۳QR?i q?6Gj{`"pߋyV~}`uLX;_{ߌFt(cRZ8,«XJ e +4N=Jm4XJ :`S! py! 
]z1K`ߙn~ZEvIZ(s_-8(Զ7 &{k&f1~po9}Z}bH' ]gBLc3-FyNHq?f1}zw s j%Y̲2͇.=^S.WWMOl؃ 5!V@|\~:D( ҩ[-J9PP#R xx3GN*lKh~C}46b3`L:jMjo% %SOR1YqP{>S:^'z2pzhnrw{}O,~?,O=[=w/[=ݲ#͹;jhLjxҬգs8т콳~P7} I-3v*\`g@@Qĩ^mpֆ oh J/ىD]  VXzF=S]envtF=otն7,]EyKÆK)f0" #01Yw?&7y.u*7sM@aߖM`^c TJ؏eSt/zEB"3(>T6a'9W\EgĖ-h?#+kL/<%iaH6O!8\qRg[ 2y˜ 2`=bk`klg6Ɯ JeMH~o4}L3CX`:[IՉ#{R`rr-+4,v(&ND"@a>CZ} c]#`q۰5mp×ukQ𕉑"#cߗH29=}:wstVmV@@R:4cxVG{>}Fa}=_:"))h=Ĩ-Wd"!ɜ-UT5ܟ9yu͜? R!Tqip"$0 e\<\JIѩm Xj2Q fq噒dBQgyIt(q;:$*u2XHKd_($5 ҂b+y2k X9Z+sl<>1eD"Bu6ҤX(Jy5;='Q+]#@LB<0֯>[`h"y*i^`5Z#պCѳ U+VCy& W !W93gp)5yaia(@FrWpl!"c̰Ʋ.XF%x1dwoگ꺮asE ,6H29̋ K@ajG9\ƈ̳‰/JUY +W-_8$,?C=A8oZ(WQ!uF@ Μ~jn3ɨsʁIP5i5E{VUӡdV&e1pdzJ;< TX`akPPa i\k:\>O]9.VeI\ @rօ!Zqk`q9gtQ3R`%'*^!8zb 0߮ScXyw nzz/ _*.Ș`@ %||ױBdVnʏ~wC!:gwC] 㢄fWƜn=&Ƙ Bh8|Nyx<`in2dWg#`  Oʤ"V88BZ)1_y`=[]wOwz9V4!S! [ wN9wN˘>Mb| 8"9ec7=߭xт{`M0 B+ {`Uhg'3u"\x`qibм( YFx6G)KQdRK%p>#O!s/蟁.!/uѮ4>*Pqf24'ZKe)\%Lo{s^cBݱ_X9cg&'@> =|eo+@c=.CR#䤏AO{zKΜ}>{Kl1/Cԁ+Tw{8c?^o0 Hwv|~!͍,5, cnI5YkM:] +LcMa91wTG6SIcs=d-֕1`qaNŪk\Çm^]KHijxӍvpӭKq?4(Rlm e\R&㺚\Ŵjeyy5Ҋ$Rt~D?#"!|MfTڡD$?jN~ % Jz\өLhCJ(y'9W~GjAN$C4LAHGw4n[ jcM+uh+W$ucrqeKD;M|GTK|/)-M. m[rMҩS[ubdwhRy:n]FL)՚uKG nuh+W-b:X?uة!#mj 5Becǩܡ#~.iuJҩd tWwmUu\DQ|1K&Ձք Ϫ _T=gj1 uFBEBRpc8dL uuOj$S詀ap)d0ArBxӀp{f׉`edƙmjpkQP{ѧ[ӭKq?5eDGH#"G2~sˑdS6$N̙oo73!?䬻SZyˆK:eDG@W/P,ai^ht9(V\v{AD̈́XČd𓩆-LPwMv2:AIʃ̞Lq1RpFv*oFHl63˲a}L_s yRgYs܇Tl?~vwn~gyf}f6rW&mw4="E Zz.%DD<]JBZW"qFJ=XPp%ELbXxgH ]ӧG$خERJfGyhVJѪKUX~JrG*o6],Dbe9M=4}$(Bb*{ /Y h|Y`1_2m!m҇oӇXflIJO!SXP33|Cu[|Mr6oV[&hĨ~KWSQ_Moopc0<_֓ {2 /. *ʕI$5Lсw1f#u%,F&qj~E94IY>PO/ʊL???˦&pxSCJ7H0#- u;xW:F~'Y!45_]aXnWL&PT_t _L,=ӟVkS@y!@fZD8B7e"L:#2X`oΫ@iܘxA|jPɈ"T "QVp-C2r4qiǹ- fsaH!OVh߮}z%xCVj4Ӄ|܁N,|ḑ_>Ǻ?$ן `9dzrwSW~흿Ժ{g#DGU3 t6_D0.!|{pwX3p010>L EwScV^Pb,}94UOqĵ'~8jC+*lV2 A Jb|zDZ- d" Nk<(h '?swXTqDәH|zTfWHeA".ɨ0JS;_% @1Ÿzw(&,{MehoBM/w5u`8e}OODt""`1&ʵn=`9/Qu` ;s5yPUV~Ԭ1VTCyntR3tX|W"՗;Tkz.xn39?rs])s{0j%7khҒznǽ+=zȃoW~:;:^J<,揋z3?}#g%]cЛg&ܟsM>U0&s h|5aå/J4o)W܂s^IІ :>ĸ^m< fy a;2¢2ў~!l<ɇWn̗5T`)•` u,SDdHY:)OKd/Vs^>-Qzld/|ZS ,gE۬Щ(}}a/,* <%(KV~hvg8ܟ`3%Z6W9Źťy/^Ryw)\cF㼰#2 &D1e&3kmq؂30]r)~var/home/core/zuul-output/logs/kubelet.log0000644000000000000000003744047015157200302017701 0ustar rootrootMar 20 07:12:42 crc systemd[1]: Starting Kubernetes Kubelet... 
Mar 20 07:12:42 crc restorecon[4691]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Mar 20 07:12:42 crc restorecon[4691]: 
/var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Mar 20 07:12:42 crc restorecon[4691]: 
/var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c574,c582 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 20 
07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c440,c975 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 07:12:42 crc restorecon[4691]: 
/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c22 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 07:12:42 crc 
restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 07:12:42 crc restorecon[4691]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 07:12:42 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:42 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 
07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 07:12:43 crc 
restorecon[4691]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 07:12:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 20 07:12:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 07:12:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 
07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 07:12:43 crc restorecon[4691]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 07:12:43 crc restorecon[4691]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 07:12:43 crc restorecon[4691]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 20 07:12:43 crc kubenswrapper[4749]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 07:12:43 crc kubenswrapper[4749]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 20 07:12:43 crc kubenswrapper[4749]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 07:12:43 crc kubenswrapper[4749]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
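The restorecon[4691] pass above is SELinux relabeling at boot: each "not reset as customized by admin" entry is a path whose context was deliberately left alone, and the single "Relabeled" entry is an actual context change. A minimal Go sketch of driving the same check by shelling out to restorecon, assuming it is on PATH (-R recursive, -n dry run, -v report each file are standard restorecon flags; the path simply mirrors the log):

```go
package main

import (
	"fmt"
	"os/exec"
)

func main() {
	// Dry-run relabel check over the kubelet state directory:
	// prints what restorecon would change without changing it.
	out, err := exec.Command("restorecon", "-Rnv", "/var/lib/kubelet").CombinedOutput()
	fmt.Print(string(out))
	if err != nil {
		fmt.Println("restorecon:", err)
	}
}
```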
Mar 20 07:12:43 crc kubenswrapper[4749]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 20 07:12:43 crc kubenswrapper[4749]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.897836    4749 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903234    4749 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903276    4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903296    4749 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903334    4749 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903342    4749 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903351    4749 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903358    4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903367    4749 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903378    4749 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903389    4749 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903416    4749 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903428    4749 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903437    4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903446    4749 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903455    4749 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903464    4749 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903473    4749 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903480    4749 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903488    4749 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903496    4749 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903504    4749 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903511    4749 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903522    4749 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903530    4749 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903537    4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903544    4749 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903552    4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903560    4749 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903570    4749 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
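These runs of warnings are feature_gate.go classifying each configured gate three ways: names the kubelet's registry does not know (the OpenShift-level gates above), GA gates that are still being set explicitly, and deprecated gates. A minimal Go sketch of that classification, using a small illustrative gate table rather than the kubelet's real registry:

```go
package main

import "fmt"

// Illustrative stage table, not the real feature_gate.go registry:
// just enough entries to reproduce the three warning shapes in the log.
var stage = map[string]string{
	"CloudDualStackNodeIPs": "GA",
	"KMSv1":                 "Deprecated",
	"NodeSwap":              "Beta",
}

func setGate(name string, enabled bool) {
	switch stage[name] {
	case "":
		fmt.Printf("W unrecognized feature gate: %s\n", name)
	case "GA":
		fmt.Printf("W Setting GA feature gate %s=%v. It will be removed in a future release.\n", name, enabled)
	case "Deprecated":
		fmt.Printf("W Setting deprecated feature gate %s=%v. It will be removed in a future release.\n", name, enabled)
	default:
		// Ordinary alpha/beta gate: accepted silently.
	}
}

func main() {
	setGate("GatewayAPI", true) // OpenShift-only name: unrecognized here
	setGate("CloudDualStackNodeIPs", true)
	setGate("KMSv1", true)
}
```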
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903579    4749 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903586    4749 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903594    4749 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903601    4749 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903610    4749 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903617    4749 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903624    4749 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903632    4749 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903639    4749 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903647    4749 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903655    4749 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903664    4749 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903671    4749 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903679    4749 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903687    4749 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903695    4749 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903703    4749 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903712    4749 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903719    4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903727    4749 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903735    4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903742    4749 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903749    4749 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903757    4749 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903765    4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903773    4749 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903780    4749 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903788    4749 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903795    4749 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903803    4749 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903811    4749 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903821    4749 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903828    4749 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903836    4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903843    4749 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903851    4749 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903859    4749 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903866    4749 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903873    4749 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903881    4749 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903888    4749 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.903896    4749 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.905799    4749 flags.go:64] FLAG: --address="0.0.0.0"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.905828    4749 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.905845    4749 flags.go:64] FLAG: --anonymous-auth="true"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.905856    4749 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.905868    4749 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.905877    4749 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.905889    4749 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.905900    4749 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.905910    4749 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.905919    4749 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.905929    4749 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.905942    4749 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.905957    4749 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.905968    4749 flags.go:64] FLAG: --cgroup-root=""
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.905979    4749 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.905990    4749 flags.go:64] FLAG: --client-ca-file=""
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906001    4749 flags.go:64] FLAG: --cloud-config=""
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906012    4749 flags.go:64] FLAG: --cloud-provider=""
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906022    4749 flags.go:64] FLAG: --cluster-dns="[]"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906035    4749 flags.go:64] FLAG: --cluster-domain=""
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906046    4749 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906057    4749 flags.go:64] FLAG: --config-dir=""
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906068    4749 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906080    4749 flags.go:64] FLAG: --container-log-max-files="5"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906093    4749 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906102    4749 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906111    4749 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906120    4749 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906129    4749 flags.go:64] FLAG: --contention-profiling="false"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906138    4749 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906147    4749 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906156    4749 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906175    4749 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906221    4749 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906231    4749 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906240    4749 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906249    4749 flags.go:64] FLAG: --enable-load-reader="false"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906258    4749 flags.go:64] FLAG: --enable-server="true"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906266    4749 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906280    4749 flags.go:64] FLAG: --event-burst="100"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906289    4749 flags.go:64] FLAG: --event-qps="50"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906323    4749 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906333    4749 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906342    4749 flags.go:64] FLAG: --eviction-hard=""
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906359    4749 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906370    4749 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906382    4749 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906393    4749 flags.go:64] FLAG: --eviction-soft=""
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906402    4749 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906412    4749 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906423    4749 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906434    4749 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906444    4749 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906454    4749 flags.go:64] FLAG: --fail-swap-on="true"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906462    4749 flags.go:64] FLAG: --feature-gates=""
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906473    4749 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906483    4749 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906491    4749 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906500    4749 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906509    4749 flags.go:64] FLAG: --healthz-port="10248"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906518    4749 flags.go:64] FLAG: --help="false"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906527    4749 flags.go:64] FLAG: --hostname-override=""
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906536    4749 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906545    4749 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906554    4749 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906564    4749 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906573    4749 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906582    4749 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906591    4749 flags.go:64] FLAG: --image-service-endpoint=""
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906600    4749 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906608    4749 flags.go:64] FLAG: --kube-api-burst="100"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906617    4749 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906627    4749 flags.go:64] FLAG: --kube-api-qps="50"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906636    4749 flags.go:64] FLAG: --kube-reserved=""
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906644    4749 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906652    4749 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906662    4749 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906671    4749 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906680    4749 flags.go:64] FLAG: --lock-file=""
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906689    4749 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906698    4749 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906707    4749 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906720    4749 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906729    4749 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906738    4749 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906747    4749 flags.go:64] FLAG: --logging-format="text"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906756    4749 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906765    4749 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906774    4749 flags.go:64] FLAG: --manifest-url=""
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906783    4749 flags.go:64] FLAG: --manifest-url-header=""
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906794    4749 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906803    4749 flags.go:64] FLAG: --max-open-files="1000000"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906814    4749 flags.go:64] FLAG: --max-pods="110"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906823    4749 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906832    4749 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906841    4749 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906849    4749 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906858    4749 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906867    4749 flags.go:64] FLAG: --node-ip="192.168.126.11"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906876    4749 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906932    4749 flags.go:64] FLAG: --node-status-max-images="50"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906942    4749 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906951    4749 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906961    4749 flags.go:64] FLAG: --pod-cidr=""
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906970    4749 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906982    4749 flags.go:64] FLAG: --pod-manifest-path=""
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.906991    4749 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.907001    4749 flags.go:64] FLAG: --pods-per-core="0"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.907011    4749 flags.go:64] FLAG: --port="10250"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.907021    4749 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.907030    4749 flags.go:64] FLAG: --provider-id=""
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.907039    4749 flags.go:64] FLAG: --qos-reserved=""
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.907047    4749 flags.go:64] FLAG: --read-only-port="10255"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.907056    4749 flags.go:64] FLAG: --register-node="true"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.907065    4749 flags.go:64] FLAG: --register-schedulable="true"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.907074    4749 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.907088    4749 flags.go:64] FLAG: --registry-burst="10"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.907096    4749 flags.go:64] FLAG: --registry-qps="5"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.907105    4749 flags.go:64] FLAG: --reserved-cpus=""
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.907115    4749 flags.go:64] FLAG: --reserved-memory=""
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.907126    4749 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.907135    4749 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.907144    4749 flags.go:64] FLAG: --rotate-certificates="false"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.907152    4749 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.907161    4749 flags.go:64] FLAG: --runonce="false"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.907170    4749 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.907179    4749 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.907188    4749 flags.go:64] FLAG: --seccomp-default="false"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.907196    4749 flags.go:64] FLAG: --serialize-image-pulls="true"
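Each flags.go:64 entry above logs one resolved command-line flag as FLAG: --name="value". A minimal Go sketch of the same dump using the standard library flag package (the kubelet itself uses pflag, so this is only an approximation of the real loop, with two hypothetical flags registered for demonstration):

```go
package main

import (
	"flag"
	"fmt"
)

func main() {
	// Two example flags mirroring values seen in the log.
	flag.Int("v", 2, "log verbosity")
	flag.String("node-ip", "192.168.126.11", "node IP address")
	flag.Parse()

	// One line per registered flag, name="value", like flags.go:64.
	// VisitAll walks all flags, including those left at their defaults.
	flag.VisitAll(func(f *flag.Flag) {
		fmt.Printf("FLAG: --%s=%q\n", f.Name, f.Value.String())
	})
}
```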
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.907205    4749 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.907214    4749 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.907223    4749 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.907232    4749 flags.go:64] FLAG: --storage-driver-password="root"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.907241    4749 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.907251    4749 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.907261    4749 flags.go:64] FLAG: --storage-driver-user="root"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.907270    4749 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.907286    4749 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.907295    4749 flags.go:64] FLAG: --system-cgroups=""
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.907331    4749 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.907345    4749 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.907355    4749 flags.go:64] FLAG: --tls-cert-file=""
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.907364    4749 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.907374    4749 flags.go:64] FLAG: --tls-min-version=""
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.907382    4749 flags.go:64] FLAG: --tls-private-key-file=""
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.907391    4749 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.907400    4749 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.907409    4749 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.907418    4749 flags.go:64] FLAG: --v="2"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.907428    4749 flags.go:64] FLAG: --version="false"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.907440    4749 flags.go:64] FLAG: --vmodule=""
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.907450    4749 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.907460    4749 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.907677    4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.907692    4749 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.907708    4749 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.907719    4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.907727    4749 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.907735    4749 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.907745    4749 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.907754    4749 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.907762    4749 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.907770    4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.907777    4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.907785    4749 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.907793    4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.907803    4749 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.907812    4749 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.907820    4749 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.907830    4749 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.907839    4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.907848    4749 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.907857    4749 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.907875    4749 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.907883    4749 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.907892    4749 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.907900    4749 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.907908    4749 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.907915    4749 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.907923    4749 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.907931    4749 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.907939    4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.907948    4749 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.907955    4749 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.907963    4749 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.907970    4749 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.907978    4749 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.907990    4749 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.907997    4749 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.908005    4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.908012    4749 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.908021    4749 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.908029    4749 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.908037    4749 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.908044    4749 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.908052    4749 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.908059    4749 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.908067    4749 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.908076    4749 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.908084    4749 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.908093    4749 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.908100    4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.908108    4749 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.908119    4749 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.908129    4749 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.908141    4749 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.908150    4749 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.908158    4749 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.908166    4749 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.908175    4749 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.908182    4749 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.908190    4749 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.908198    4749 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.908207    4749 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.908215    4749 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.908223    4749 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.908230    4749 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.908238    4749 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.908246    4749 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.908255    4749 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.908263    4749 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.908271    4749 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.908285    4749 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.908295    4749 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.908345    4749 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.927353    4749 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.927401    4749 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927542    4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927557    4749 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927567    4749 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927575    4749 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927584    4749 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927592    4749 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927600    4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927608    4749 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927615    4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927623    4749 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927631    4749 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927639    4749 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927646    4749 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927655    4749 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927663    4749 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927671    4749 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927678    4749 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927686    4749 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927693    4749 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927701    4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927710    4749 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927718    4749 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927726    4749 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927735    4749 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927742    4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927750    4749 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927758    4749 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927765    4749 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927773    4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927780    4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927789    4749 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927796    4749 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927804    4749 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927815    4749 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927827    4749 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927838    4749 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927849    4749 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927857    4749 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927865    4749 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927876    4749 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
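The "Golang settings" entry in this run reports GOGC, GOMAXPROCS and GOTRACEBACK exactly as they appear in the process environment, so the empty quotes mean the variables are unset. A sketch of the equivalent lookup (the real line comes from server.go:493; this is just runtime.Version and os.Getenv):

```go
package main

import (
	"fmt"
	"os"
	"runtime"
)

func main() {
	fmt.Println("Go runtime:", runtime.Version())
	// Empty strings mean the variable is unset, matching GOGC="" etc.
	for _, k := range []string{"GOGC", "GOMAXPROCS", "GOTRACEBACK"} {
		fmt.Printf("%s=%q\n", k, os.Getenv(k))
	}
}
```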
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927886 4749 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927896 4749 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927904 4749 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927913 4749 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927922 4749 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927930 4749 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927940 4749 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927948 4749 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927958 4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927966 4749 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927973 4749 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927981 4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927989 4749 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.927997 4749 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928005 4749 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928012 4749 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928020 4749 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928028 4749 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928036 4749 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928044 4749 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928052 4749 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928060 4749 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928067 4749 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928075 4749 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928082 4749 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928091 4749 feature_gate.go:330] unrecognized feature gate: 
AzureWorkloadIdentity Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928101 4749 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928112 4749 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928120 4749 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928129 4749 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928137 4749 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.928151 4749 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928579 4749 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928601 4749 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928613 4749 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928621 4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928629 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928638 4749 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928646 4749 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928657 4749 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928667 4749 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928676 4749 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928685 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928693 4749 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928702 4749 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928711 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928719 4749 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928727 4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928735 4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928743 4749 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928751 4749 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928758 4749 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928767 4749 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928775 4749 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928783 4749 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928791 4749 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928798 4749 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928807 4749 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928815 4749 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928823 4749 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928833 4749 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928843 4749 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928854 4749 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928865 4749 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928873 4749 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928882 4749 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928890 4749 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928897 4749 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928906 4749 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928913 4749 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928921 4749 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928929 4749 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928937 4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928945 4749 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928953 4749 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928960 4749 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928968 4749 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928975 4749 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928983 4749 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928991 4749 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.928998 4749 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.929006 4749 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.929014 4749 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.929021 4749 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.929029 4749 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.929038 4749 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.929045 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.929053 4749 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.929061 4749 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.929069 4749 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.929077 4749 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.929085 4749 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.929092 4749 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.929100 4749 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.929108 4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.929116 4749 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.929123 4749 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.929131 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.929141 4749 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.929151 4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.929159 4749 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.929168 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 07:12:43 crc kubenswrapper[4749]: W0320 07:12:43.929177 4749 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.929189 4749 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.929508 4749 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 20 07:12:43 crc kubenswrapper[4749]: E0320 07:12:43.934069 4749 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.941623 4749 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.941770 4749 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.943994 4749 server.go:997] "Starting client certificate rotation"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.944041 4749 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.944230 4749 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.971518 4749 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.974732 4749 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 20 07:12:43 crc kubenswrapper[4749]: E0320 07:12:43.975332 4749 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.50:6443: connect: connection refused" logger="UnhandledError"
Mar 20 07:12:43 crc kubenswrapper[4749]: I0320 07:12:43.992480 4749 log.go:25] "Validated CRI v1 runtime API"
Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.031479 4749 log.go:25] "Validated CRI v1 image API"
Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.037339 4749 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.043374 4749 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-20-07-08-11-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.043420 4749 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.074822 4749 manager.go:217] Machine: {Timestamp:2026-03-20 07:12:44.071160529 +0000 UTC m=+0.620818266 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:42f570dd-c9b2-4d24-870f-033a21aa11c5 BootID:e6cbc31b-af36-4be8-8e88-99f024097007 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:5d:06:0b Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:5d:06:0b Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:16:88:f9 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:fa:27:2d Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:9f:e2:b4 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:3b:19:1c Speed:-1 Mtu:1496} {Name:eth10 MacAddress:b6:49:4a:4e:8d:fb Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:2e:a9:76:c8:bd:b3 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.075239 4749 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.075587 4749 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.078760 4749 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.079269 4749 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.079410 4749 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.079872 4749 topology_manager.go:138] "Creating topology manager with none policy"
Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.079899 4749 container_manager_linux.go:303] "Creating device plugin manager"
Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.080624 4749 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.080698 4749 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.081073 4749 state_mem.go:36] "Initialized new in-memory state store"
Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.081248 4749 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.085832 4749 kubelet.go:418] "Attempting to sync node with API server"
Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.085898 4749 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.085962 4749 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.085985 4749 kubelet.go:324] "Adding apiserver pod source"
Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.086004 4749 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 20 07:12:44 crc kubenswrapper[4749]: W0320 07:12:44.089860 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.50:6443: connect: connection refused
Mar 20 07:12:44 crc kubenswrapper[4749]: W0320 07:12:44.089914 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.50:6443: connect: connection refused
Mar 20 07:12:44 crc kubenswrapper[4749]: E0320 07:12:44.090023 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.50:6443: connect: connection refused" logger="UnhandledError"
Mar 20 07:12:44 crc kubenswrapper[4749]: E0320 07:12:44.089966 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.50:6443: connect: connection refused" logger="UnhandledError"
Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.091221 4749 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.092493 4749 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.094042 4749 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.096013 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.096058 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.096087 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.096102 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.096127 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.096140 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.096155 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.096177 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.096192 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.096206 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.096248 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.096262 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.098394 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.099183 4749 server.go:1280] "Started kubelet"
Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.100402 4749 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.100418 4749 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 20 07:12:44 crc systemd[1]: Started Kubernetes Kubelet.
Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.101435 4749 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.101781 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.50:6443: connect: connection refused Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.103674 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.103872 4749 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 20 07:12:44 crc kubenswrapper[4749]: E0320 07:12:44.104191 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.104265 4749 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.104347 4749 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.104484 4749 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 20 07:12:44 crc kubenswrapper[4749]: W0320 07:12:44.107907 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.50:6443: connect: connection refused Mar 20 07:12:44 crc kubenswrapper[4749]: E0320 07:12:44.108822 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.50:6443: connect: connection refused" logger="UnhandledError" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.106461 4749 server.go:460] "Adding debug handlers to kubelet server" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.109602 4749 factory.go:55] Registering systemd factory Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.109642 4749 factory.go:221] Registration of the systemd container factory successfully Mar 20 07:12:44 crc kubenswrapper[4749]: E0320 07:12:44.108724 4749 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.50:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189e7b3370f3fe5d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:44.099141213 +0000 UTC m=+0.648798890,LastTimestamp:2026-03-20 07:12:44.099141213 +0000 UTC m=+0.648798890,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.110061 4749 factory.go:153] Registering CRI-O factory Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.110101 4749 factory.go:221] Registration of the crio container factory 
successfully Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.110231 4749 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.110276 4749 factory.go:103] Registering Raw factory Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.110344 4749 manager.go:1196] Started watching for new ooms in manager Mar 20 07:12:44 crc kubenswrapper[4749]: E0320 07:12:44.106642 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" interval="200ms" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.112633 4749 manager.go:319] Starting recovery of all containers Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.125392 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.125467 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.125489 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.125509 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.125528 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.127880 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.127903 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.127926 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" 
seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.127947 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.127966 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.127984 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.128004 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.128021 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.128044 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.128061 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.128080 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.128099 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.128118 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.128184 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 20 07:12:44 crc 
kubenswrapper[4749]: I0320 07:12:44.128207 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.128226 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.128244 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.128263 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.128286 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.128433 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.128454 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.128475 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.128499 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.128517 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.128536 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 
07:12:44.128653 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.128671 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.128692 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.128713 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.128734 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.128752 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.128770 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.128788 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.128806 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.128824 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.128842 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.128861 4749 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.128880 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.128897 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.128916 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.128935 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.128954 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.128971 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.128991 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.129010 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.129029 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.129047 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.129072 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.129093 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.129113 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.129135 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.129155 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.129173 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.129192 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.129211 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.129239 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.129258 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.129284 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.129342 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.129364 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.129383 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.129400 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.129421 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.129439 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.129458 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.129475 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.129492 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.129509 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.129527 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.129544 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.131825 4749 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.131872 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.131891 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.131909 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.131923 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.131940 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.131955 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.131969 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.131983 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.131995 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.132008 4749 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.132019 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.132032 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.132045 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.132057 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.132070 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.132082 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.132097 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.132109 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.132121 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.132134 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.132147 4749 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.132160 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.132171 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.132184 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.132196 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.132210 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.132222 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.132239 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.132255 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.132285 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.132326 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.132346 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.132388 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.132409 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.132426 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.132445 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.132465 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.132483 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.132500 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.132517 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.132534 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.132548 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.132561 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.132573 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.132587 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.132600 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.132612 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.132625 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.132639 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.132651 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.132663 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.132676 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.133385 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.133449 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.133473 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.133492 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.133512 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.133532 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.133551 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.133570 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.133589 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.133607 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.133625 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.133646 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.133663 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.133681 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.133700 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.133718 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.133748 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.133767 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.133792 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.133811 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.133829 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.133847 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.133866 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.133887 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.133906 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.133927 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.133947 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.133966 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.133987 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.134005 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.134023 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.134042 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.134063 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.134082 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.134099 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.134117 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.134135 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.134155 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.134175 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.134194 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.134212 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.134230 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.134249 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.134267 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.134296 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.134355 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.134377 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.134396 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.134415 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.134434 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.134453 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.134471 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.134490 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.134509 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.134525 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.134542 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.134559 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.134576 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.134596 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.134623 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.134647 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.134669 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.134693 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.134717 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.134744 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.134766 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.134784 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.134802 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.134818 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.134834 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.134854 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.134872 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.134889 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.134906 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.134925 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.134943 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.134960 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.134978 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.134997 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.135016 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.135037 4749 reconstruct.go:97] "Volume reconstruction finished" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.135051 4749 reconciler.go:26] "Reconciler: start to sync state" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.145884 4749 manager.go:324] Recovery completed Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.163271 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.166265 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.166335 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.166351 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.168604 4749 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.168623 4749 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.168646 4749 state_mem.go:36] "Initialized new in-memory state store" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.172814 4749 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.175693 4749 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.175734 4749 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.175964 4749 kubelet.go:2335] "Starting kubelet main sync loop" Mar 20 07:12:44 crc kubenswrapper[4749]: E0320 07:12:44.176016 4749 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 20 07:12:44 crc kubenswrapper[4749]: W0320 07:12:44.176746 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.50:6443: connect: connection refused Mar 20 07:12:44 crc kubenswrapper[4749]: E0320 07:12:44.176850 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.50:6443: connect: connection refused" logger="UnhandledError" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.185515 4749 policy_none.go:49] "None policy: Start" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.186527 4749 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.186555 4749 state_mem.go:35] "Initializing new in-memory state store" Mar 20 07:12:44 crc kubenswrapper[4749]: E0320 07:12:44.204433 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.257764 4749 manager.go:334] "Starting Device Plugin manager" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.257880 4749 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.257895 4749 server.go:79] "Starting device plugin registration server" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.259395 4749 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.259414 4749 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.259700 4749 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.259923 4749 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.259954 4749 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 20 07:12:44 crc kubenswrapper[4749]: E0320 07:12:44.267037 4749 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.276820 4749 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 20 07:12:44 crc kubenswrapper[4749]: 
I0320 07:12:44.276976 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.278147 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.278202 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.278214 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.278468 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.278629 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.278673 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.279360 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.279389 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.279398 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.279424 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.279443 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.279453 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.279602 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.279844 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.279929 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.280916 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.280943 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.280953 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.281024 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.281354 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.281380 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.281929 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.281950 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.281957 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.281936 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.281986 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.282020 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.282046 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.282050 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.282123 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.282029 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.282346 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.282375 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.283052 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.283087 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.283099 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.283233 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.283254 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.283597 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.283628 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.283643 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.284360 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.284384 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.284395 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:12:44 crc kubenswrapper[4749]: E0320 07:12:44.312972 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" interval="400ms" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.337021 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.337091 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.337130 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.337196 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.337228 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 
07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.337259 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.337324 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.337354 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.337384 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.337411 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.337442 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.337472 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.337501 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.337529 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.337559 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.359529 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.360666 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.360769 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.360792 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.360833 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 07:12:44 crc kubenswrapper[4749]: E0320 07:12:44.361457 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.50:6443: connect: connection refused" node="crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.438707 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.438767 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.438800 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.438831 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.438864 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.438895 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.438923 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.438954 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.438986 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.439016 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.439013 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.439042 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.439073 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.439101 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.439130 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.439134 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod 
\"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.439158 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.439172 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.439199 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.439252 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.439253 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.439038 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.439341 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.439396 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.439403 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.439092 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.439443 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.439134 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.439564 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.439566 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.561823 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.563714 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.563767 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.563784 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.563816 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 07:12:44 crc kubenswrapper[4749]: E0320 07:12:44.564383 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.50:6443: connect: connection refused" node="crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.605114 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.630115 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: W0320 07:12:44.650922 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-412fb7f1c2d8a99ecd20f3f42a7e80104f252ae72a1b92c0d2c6ff210238587f WatchSource:0}: Error finding container 412fb7f1c2d8a99ecd20f3f42a7e80104f252ae72a1b92c0d2c6ff210238587f: Status 404 returned error can't find the container with id 412fb7f1c2d8a99ecd20f3f42a7e80104f252ae72a1b92c0d2c6ff210238587f Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.658613 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: W0320 07:12:44.679423 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-6ccd8959d5986d59b545f33443d0226f113dd6eee7ea34c34e45eb36ba5aa9bc WatchSource:0}: Error finding container 6ccd8959d5986d59b545f33443d0226f113dd6eee7ea34c34e45eb36ba5aa9bc: Status 404 returned error can't find the container with id 6ccd8959d5986d59b545f33443d0226f113dd6eee7ea34c34e45eb36ba5aa9bc Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.682047 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.687431 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 07:12:44 crc kubenswrapper[4749]: W0320 07:12:44.708646 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-59751de54f9c0998d6ec0a734bcff61993818a3fa8a3c7f636d0dac2b4b87afa WatchSource:0}: Error finding container 59751de54f9c0998d6ec0a734bcff61993818a3fa8a3c7f636d0dac2b4b87afa: Status 404 returned error can't find the container with id 59751de54f9c0998d6ec0a734bcff61993818a3fa8a3c7f636d0dac2b4b87afa Mar 20 07:12:44 crc kubenswrapper[4749]: E0320 07:12:44.714004 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" interval="800ms" Mar 20 07:12:44 crc kubenswrapper[4749]: W0320 07:12:44.717359 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-924afcbfff54983afc105eddf30c0fb9c1f551c941559dca1715aef254f46a8d WatchSource:0}: Error finding container 924afcbfff54983afc105eddf30c0fb9c1f551c941559dca1715aef254f46a8d: Status 404 returned error can't find the container with id 924afcbfff54983afc105eddf30c0fb9c1f551c941559dca1715aef254f46a8d Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.965261 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.967203 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 
07:12:44.967254 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.967269 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:12:44 crc kubenswrapper[4749]: I0320 07:12:44.967332 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 07:12:44 crc kubenswrapper[4749]: E0320 07:12:44.967879 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.50:6443: connect: connection refused" node="crc" Mar 20 07:12:45 crc kubenswrapper[4749]: W0320 07:12:45.103000 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.50:6443: connect: connection refused Mar 20 07:12:45 crc kubenswrapper[4749]: E0320 07:12:45.103096 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.50:6443: connect: connection refused" logger="UnhandledError" Mar 20 07:12:45 crc kubenswrapper[4749]: I0320 07:12:45.103433 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.50:6443: connect: connection refused Mar 20 07:12:45 crc kubenswrapper[4749]: W0320 07:12:45.137644 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.50:6443: connect: connection refused Mar 20 07:12:45 crc kubenswrapper[4749]: E0320 07:12:45.137766 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.50:6443: connect: connection refused" logger="UnhandledError" Mar 20 07:12:45 crc kubenswrapper[4749]: I0320 07:12:45.180823 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6ccd8959d5986d59b545f33443d0226f113dd6eee7ea34c34e45eb36ba5aa9bc"} Mar 20 07:12:45 crc kubenswrapper[4749]: I0320 07:12:45.181998 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"21607fa0c91c0b3cd4cf89c91b75787e6130ebdb6074c1e8d1f0fc7a62353281"} Mar 20 07:12:45 crc kubenswrapper[4749]: I0320 07:12:45.183199 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"412fb7f1c2d8a99ecd20f3f42a7e80104f252ae72a1b92c0d2c6ff210238587f"} Mar 20 07:12:45 crc kubenswrapper[4749]: I0320 07:12:45.184321 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"924afcbfff54983afc105eddf30c0fb9c1f551c941559dca1715aef254f46a8d"} Mar 20 07:12:45 crc kubenswrapper[4749]: I0320 07:12:45.185656 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"59751de54f9c0998d6ec0a734bcff61993818a3fa8a3c7f636d0dac2b4b87afa"} Mar 20 07:12:45 crc kubenswrapper[4749]: W0320 07:12:45.357500 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.50:6443: connect: connection refused Mar 20 07:12:45 crc kubenswrapper[4749]: E0320 07:12:45.357608 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.50:6443: connect: connection refused" logger="UnhandledError" Mar 20 07:12:45 crc kubenswrapper[4749]: W0320 07:12:45.440012 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.50:6443: connect: connection refused Mar 20 07:12:45 crc kubenswrapper[4749]: E0320 07:12:45.440157 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.50:6443: connect: connection refused" logger="UnhandledError" Mar 20 07:12:45 crc kubenswrapper[4749]: E0320 07:12:45.515821 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" interval="1.6s" Mar 20 07:12:45 crc kubenswrapper[4749]: I0320 07:12:45.768748 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:12:45 crc kubenswrapper[4749]: I0320 07:12:45.770494 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:12:45 crc kubenswrapper[4749]: I0320 07:12:45.770544 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:12:45 crc kubenswrapper[4749]: I0320 07:12:45.770561 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:12:45 crc kubenswrapper[4749]: I0320 07:12:45.770595 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 07:12:45 crc kubenswrapper[4749]: E0320 07:12:45.771172 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.50:6443: connect: connection refused" node="crc" Mar 20 07:12:46 crc kubenswrapper[4749]: I0320 07:12:46.101581 4749 certificate_manager.go:356] 
kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 07:12:46 crc kubenswrapper[4749]: E0320 07:12:46.102813 4749 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.50:6443: connect: connection refused" logger="UnhandledError" Mar 20 07:12:46 crc kubenswrapper[4749]: I0320 07:12:46.102906 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.50:6443: connect: connection refused Mar 20 07:12:46 crc kubenswrapper[4749]: I0320 07:12:46.192484 4749 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="830983de883dd8f7cd7c3da3c23b2d33e795b3c75222381378c17d43f8fb435f" exitCode=0 Mar 20 07:12:46 crc kubenswrapper[4749]: I0320 07:12:46.192614 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"830983de883dd8f7cd7c3da3c23b2d33e795b3c75222381378c17d43f8fb435f"} Mar 20 07:12:46 crc kubenswrapper[4749]: I0320 07:12:46.192662 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:12:46 crc kubenswrapper[4749]: I0320 07:12:46.193821 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:12:46 crc kubenswrapper[4749]: I0320 07:12:46.193877 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:12:46 crc kubenswrapper[4749]: I0320 07:12:46.193894 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:12:46 crc kubenswrapper[4749]: I0320 07:12:46.196172 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7" exitCode=0 Mar 20 07:12:46 crc kubenswrapper[4749]: I0320 07:12:46.196273 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7"} Mar 20 07:12:46 crc kubenswrapper[4749]: I0320 07:12:46.196715 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:12:46 crc kubenswrapper[4749]: I0320 07:12:46.198198 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:12:46 crc kubenswrapper[4749]: I0320 07:12:46.198253 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:12:46 crc kubenswrapper[4749]: I0320 07:12:46.198270 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:12:46 crc kubenswrapper[4749]: I0320 07:12:46.200835 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:12:46 crc kubenswrapper[4749]: I0320 07:12:46.202467 
4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"eb647173cc4db7ee41f0b9e18c802b8d111c5f0819c619960b4b29f4f698e558"} Mar 20 07:12:46 crc kubenswrapper[4749]: I0320 07:12:46.202563 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"92f4faf3fc2d9dbe3235934b4065feba40ae56c54cf67e5792a9183f56014fd5"} Mar 20 07:12:46 crc kubenswrapper[4749]: I0320 07:12:46.202588 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2bf150976cb265b706cccb6e625a9a0a06d47f2dbd69d032957551966e691e43"} Mar 20 07:12:46 crc kubenswrapper[4749]: I0320 07:12:46.202607 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1ee1edb21f5116ef152f5808824f0529ac3bc52a3959df7a21c031da45b5284a"} Mar 20 07:12:46 crc kubenswrapper[4749]: I0320 07:12:46.202604 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:12:46 crc kubenswrapper[4749]: I0320 07:12:46.203080 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:12:46 crc kubenswrapper[4749]: I0320 07:12:46.203122 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:12:46 crc kubenswrapper[4749]: I0320 07:12:46.203143 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:12:46 crc kubenswrapper[4749]: I0320 07:12:46.204464 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:12:46 crc kubenswrapper[4749]: I0320 07:12:46.204753 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:12:46 crc kubenswrapper[4749]: I0320 07:12:46.204869 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:12:46 crc kubenswrapper[4749]: I0320 07:12:46.205202 4749 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347" exitCode=0 Mar 20 07:12:46 crc kubenswrapper[4749]: I0320 07:12:46.205277 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347"} Mar 20 07:12:46 crc kubenswrapper[4749]: I0320 07:12:46.205327 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:12:46 crc kubenswrapper[4749]: I0320 07:12:46.206495 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:12:46 crc kubenswrapper[4749]: I0320 07:12:46.206526 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:12:46 crc 
kubenswrapper[4749]: I0320 07:12:46.206543 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:12:46 crc kubenswrapper[4749]: I0320 07:12:46.209130 4749 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="49ef347d178defb70362fc7330bec72e266f77e6bd46c7ce4cf0c7018d585171" exitCode=0 Mar 20 07:12:46 crc kubenswrapper[4749]: I0320 07:12:46.209182 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"49ef347d178defb70362fc7330bec72e266f77e6bd46c7ce4cf0c7018d585171"} Mar 20 07:12:46 crc kubenswrapper[4749]: I0320 07:12:46.209206 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:12:46 crc kubenswrapper[4749]: I0320 07:12:46.210548 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:12:46 crc kubenswrapper[4749]: I0320 07:12:46.210626 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:12:46 crc kubenswrapper[4749]: I0320 07:12:46.210652 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:12:47 crc kubenswrapper[4749]: I0320 07:12:47.102856 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.50:6443: connect: connection refused Mar 20 07:12:47 crc kubenswrapper[4749]: E0320 07:12:47.122412 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" interval="3.2s" Mar 20 07:12:47 crc kubenswrapper[4749]: I0320 07:12:47.211566 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"3db71fc201b999f26a4841d7cff88cd6c415d1a2ad4920d354ed394ac8ad2982"} Mar 20 07:12:47 crc kubenswrapper[4749]: I0320 07:12:47.211663 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:12:47 crc kubenswrapper[4749]: I0320 07:12:47.212751 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:12:47 crc kubenswrapper[4749]: I0320 07:12:47.212776 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:12:47 crc kubenswrapper[4749]: I0320 07:12:47.212783 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:12:47 crc kubenswrapper[4749]: I0320 07:12:47.215215 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"b9041a702c52186ffb23b29e5a5ddeddefef6f576571a20f1f43027ff3225641"} Mar 20 07:12:47 crc kubenswrapper[4749]: I0320 07:12:47.215238 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"189ab91bc96e9893f362ea6fae4ae81880b230b84a3987760796360150187043"} Mar 20 07:12:47 crc kubenswrapper[4749]: I0320 07:12:47.215249 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"93eaff3eb0b1240b3d19fdd70f9a27c0543e794b7bb61e3e1886807a8a712758"} Mar 20 07:12:47 crc kubenswrapper[4749]: I0320 07:12:47.215274 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:12:47 crc kubenswrapper[4749]: I0320 07:12:47.216733 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:12:47 crc kubenswrapper[4749]: I0320 07:12:47.216770 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:12:47 crc kubenswrapper[4749]: I0320 07:12:47.216778 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:12:47 crc kubenswrapper[4749]: I0320 07:12:47.221004 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"12cb6e64ecd020e07bd8f22e52fcf960c975a09da0f06a9f43daf5bfbff01de3"} Mar 20 07:12:47 crc kubenswrapper[4749]: I0320 07:12:47.221041 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"21e71bf5e132166e8d3e2f33eb325502e54ff36380220a07917135b27ebe41c6"} Mar 20 07:12:47 crc kubenswrapper[4749]: I0320 07:12:47.221057 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5b332a4612c6855c57c6c15a305a1f56099dab01f849027ea2eeda56718010cc"} Mar 20 07:12:47 crc kubenswrapper[4749]: I0320 07:12:47.221082 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b7a9d3d56425dd88c89608d446f6d44c5f90644cea243dd023e74c5630a0a99e"} Mar 20 07:12:47 crc kubenswrapper[4749]: I0320 07:12:47.225858 4749 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915" exitCode=0 Mar 20 07:12:47 crc kubenswrapper[4749]: I0320 07:12:47.226181 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915"} Mar 20 07:12:47 crc kubenswrapper[4749]: I0320 07:12:47.226397 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:12:47 crc kubenswrapper[4749]: I0320 07:12:47.226420 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:12:47 crc kubenswrapper[4749]: I0320 07:12:47.227697 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:12:47 crc 
kubenswrapper[4749]: I0320 07:12:47.227741 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:12:47 crc kubenswrapper[4749]: I0320 07:12:47.227757 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:12:47 crc kubenswrapper[4749]: I0320 07:12:47.228036 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:12:47 crc kubenswrapper[4749]: I0320 07:12:47.228147 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:12:47 crc kubenswrapper[4749]: I0320 07:12:47.228372 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:12:47 crc kubenswrapper[4749]: I0320 07:12:47.371863 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:12:47 crc kubenswrapper[4749]: I0320 07:12:47.374457 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:12:47 crc kubenswrapper[4749]: I0320 07:12:47.374506 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:12:47 crc kubenswrapper[4749]: I0320 07:12:47.374524 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:12:47 crc kubenswrapper[4749]: I0320 07:12:47.374559 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 07:12:47 crc kubenswrapper[4749]: E0320 07:12:47.375018 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.50:6443: connect: connection refused" node="crc" Mar 20 07:12:47 crc kubenswrapper[4749]: W0320 07:12:47.479499 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.50:6443: connect: connection refused Mar 20 07:12:47 crc kubenswrapper[4749]: E0320 07:12:47.479602 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.50:6443: connect: connection refused" logger="UnhandledError" Mar 20 07:12:48 crc kubenswrapper[4749]: I0320 07:12:48.023252 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 07:12:48 crc kubenswrapper[4749]: I0320 07:12:48.033201 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 07:12:48 crc kubenswrapper[4749]: I0320 07:12:48.232480 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f8efe115d35b0dfdc49fce523a818b421749f312cc5ebbbb81772e727f791ef1"} Mar 20 07:12:48 crc kubenswrapper[4749]: I0320 07:12:48.232551 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:12:48 crc 
kubenswrapper[4749]: I0320 07:12:48.233716 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:12:48 crc kubenswrapper[4749]: I0320 07:12:48.233775 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:12:48 crc kubenswrapper[4749]: I0320 07:12:48.233795 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:12:48 crc kubenswrapper[4749]: I0320 07:12:48.236669 4749 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8" exitCode=0 Mar 20 07:12:48 crc kubenswrapper[4749]: I0320 07:12:48.236809 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:12:48 crc kubenswrapper[4749]: I0320 07:12:48.236864 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:12:48 crc kubenswrapper[4749]: I0320 07:12:48.236893 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:12:48 crc kubenswrapper[4749]: I0320 07:12:48.236796 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8"} Mar 20 07:12:48 crc kubenswrapper[4749]: I0320 07:12:48.237064 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 07:12:48 crc kubenswrapper[4749]: I0320 07:12:48.237405 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:12:48 crc kubenswrapper[4749]: I0320 07:12:48.238737 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:12:48 crc kubenswrapper[4749]: I0320 07:12:48.238779 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:12:48 crc kubenswrapper[4749]: I0320 07:12:48.238797 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:12:48 crc kubenswrapper[4749]: I0320 07:12:48.238802 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:12:48 crc kubenswrapper[4749]: I0320 07:12:48.238837 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:12:48 crc kubenswrapper[4749]: I0320 07:12:48.238859 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:12:48 crc kubenswrapper[4749]: I0320 07:12:48.239052 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:12:48 crc kubenswrapper[4749]: I0320 07:12:48.239089 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:12:48 crc kubenswrapper[4749]: I0320 07:12:48.239106 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:12:48 crc kubenswrapper[4749]: I0320 07:12:48.239687 4749 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:12:48 crc kubenswrapper[4749]: I0320 07:12:48.239756 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:12:48 crc kubenswrapper[4749]: I0320 07:12:48.239783 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:12:48 crc kubenswrapper[4749]: I0320 07:12:48.277139 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 07:12:48 crc kubenswrapper[4749]: I0320 07:12:48.813032 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 07:12:49 crc kubenswrapper[4749]: I0320 07:12:49.246725 4749 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 07:12:49 crc kubenswrapper[4749]: I0320 07:12:49.246810 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:12:49 crc kubenswrapper[4749]: I0320 07:12:49.246843 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0a0549d8b1f6c2132ed8356ac6c67078f9431cf7a9b057922e0ba5e2eb9f7f3e"} Mar 20 07:12:49 crc kubenswrapper[4749]: I0320 07:12:49.246890 4749 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 07:12:49 crc kubenswrapper[4749]: I0320 07:12:49.246919 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f986500b6f9e4ab2cf3366a7e05e9274f9192bdc576e52c82f8dafc9f1ce37c2"} Mar 20 07:12:49 crc kubenswrapper[4749]: I0320 07:12:49.246953 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a361154efd67448ce4f9008639d02864d9d3aa766b0937f3729b13a5d0b8948a"} Mar 20 07:12:49 crc kubenswrapper[4749]: I0320 07:12:49.246957 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:12:49 crc kubenswrapper[4749]: I0320 07:12:49.247020 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:12:49 crc kubenswrapper[4749]: I0320 07:12:49.248818 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:12:49 crc kubenswrapper[4749]: I0320 07:12:49.248837 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:12:49 crc kubenswrapper[4749]: I0320 07:12:49.248883 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:12:49 crc kubenswrapper[4749]: I0320 07:12:49.248908 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:12:49 crc kubenswrapper[4749]: I0320 07:12:49.248908 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:12:49 crc kubenswrapper[4749]: I0320 07:12:49.248962 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:12:49 crc kubenswrapper[4749]: I0320 
07:12:49.248982 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:12:49 crc kubenswrapper[4749]: I0320 07:12:49.248925 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:12:49 crc kubenswrapper[4749]: I0320 07:12:49.249032 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:12:50 crc kubenswrapper[4749]: I0320 07:12:50.179366 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 07:12:50 crc kubenswrapper[4749]: I0320 07:12:50.255070 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"478f80cc5a3895ca8ae8adbaa46990b39941e2824b2bf1c93ca34bb7d15cbdd3"} Mar 20 07:12:50 crc kubenswrapper[4749]: I0320 07:12:50.255131 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"94f97471d68760ce0f43e5c1c0bafa7c6b429812dd58e2b2fa2eabd378a0789d"} Mar 20 07:12:50 crc kubenswrapper[4749]: I0320 07:12:50.255151 4749 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 07:12:50 crc kubenswrapper[4749]: I0320 07:12:50.255206 4749 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 07:12:50 crc kubenswrapper[4749]: I0320 07:12:50.255234 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:12:50 crc kubenswrapper[4749]: I0320 07:12:50.255267 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:12:50 crc kubenswrapper[4749]: I0320 07:12:50.255333 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:12:50 crc kubenswrapper[4749]: I0320 07:12:50.257060 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:12:50 crc kubenswrapper[4749]: I0320 07:12:50.257115 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:12:50 crc kubenswrapper[4749]: I0320 07:12:50.257117 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:12:50 crc kubenswrapper[4749]: I0320 07:12:50.257160 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:12:50 crc kubenswrapper[4749]: I0320 07:12:50.257183 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:12:50 crc kubenswrapper[4749]: I0320 07:12:50.257132 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:12:50 crc kubenswrapper[4749]: I0320 07:12:50.257973 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:12:50 crc kubenswrapper[4749]: I0320 07:12:50.258025 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:12:50 crc kubenswrapper[4749]: I0320 07:12:50.258041 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 20 07:12:50 crc kubenswrapper[4749]: I0320 07:12:50.455070 4749 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 07:12:50 crc kubenswrapper[4749]: I0320 07:12:50.539038 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 07:12:50 crc kubenswrapper[4749]: I0320 07:12:50.575446 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:12:50 crc kubenswrapper[4749]: I0320 07:12:50.577138 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:12:50 crc kubenswrapper[4749]: I0320 07:12:50.577213 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:12:50 crc kubenswrapper[4749]: I0320 07:12:50.577234 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:12:50 crc kubenswrapper[4749]: I0320 07:12:50.577271 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 07:12:51 crc kubenswrapper[4749]: I0320 07:12:51.261797 4749 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 07:12:51 crc kubenswrapper[4749]: I0320 07:12:51.261843 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:12:51 crc kubenswrapper[4749]: I0320 07:12:51.261946 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:12:51 crc kubenswrapper[4749]: I0320 07:12:51.261960 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:12:51 crc kubenswrapper[4749]: I0320 07:12:51.262854 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:12:51 crc kubenswrapper[4749]: I0320 07:12:51.262887 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:12:51 crc kubenswrapper[4749]: I0320 07:12:51.262898 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:12:51 crc kubenswrapper[4749]: I0320 07:12:51.263469 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:12:51 crc kubenswrapper[4749]: I0320 07:12:51.263532 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:12:51 crc kubenswrapper[4749]: I0320 07:12:51.263560 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:12:51 crc kubenswrapper[4749]: I0320 07:12:51.263614 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:12:51 crc kubenswrapper[4749]: I0320 07:12:51.263654 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:12:51 crc kubenswrapper[4749]: I0320 07:12:51.263671 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:12:51 crc kubenswrapper[4749]: I0320 07:12:51.767620 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 07:12:51 crc kubenswrapper[4749]: I0320 07:12:51.813141 4749 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 07:12:51 crc kubenswrapper[4749]: I0320 07:12:51.813256 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 07:12:52 crc kubenswrapper[4749]: I0320 07:12:52.264346 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:12:52 crc kubenswrapper[4749]: I0320 07:12:52.265671 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:12:52 crc kubenswrapper[4749]: I0320 07:12:52.265706 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:12:52 crc kubenswrapper[4749]: I0320 07:12:52.265718 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:12:53 crc kubenswrapper[4749]: I0320 07:12:53.200363 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 07:12:53 crc kubenswrapper[4749]: I0320 07:12:53.200561 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:12:53 crc kubenswrapper[4749]: I0320 07:12:53.201992 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:12:53 crc kubenswrapper[4749]: I0320 07:12:53.202051 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:12:53 crc kubenswrapper[4749]: I0320 07:12:53.202072 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:12:54 crc kubenswrapper[4749]: E0320 07:12:54.267183 4749 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 07:12:55 crc kubenswrapper[4749]: I0320 07:12:55.240706 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 20 07:12:55 crc kubenswrapper[4749]: I0320 07:12:55.240897 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:12:55 crc kubenswrapper[4749]: I0320 07:12:55.243798 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:12:55 crc kubenswrapper[4749]: I0320 07:12:55.243882 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:12:55 crc kubenswrapper[4749]: I0320 07:12:55.243965 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:12:55 crc kubenswrapper[4749]: I0320 07:12:55.745440 4749 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 20 07:12:55 crc kubenswrapper[4749]: I0320 07:12:55.745686 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:12:55 crc kubenswrapper[4749]: I0320 07:12:55.747397 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:12:55 crc kubenswrapper[4749]: I0320 07:12:55.747466 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:12:55 crc kubenswrapper[4749]: I0320 07:12:55.747490 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:12:57 crc kubenswrapper[4749]: W0320 07:12:57.778267 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 20 07:12:57 crc kubenswrapper[4749]: I0320 07:12:57.778486 4749 trace.go:236] Trace[2104605450]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Mar-2026 07:12:47.776) (total time: 10002ms): Mar 20 07:12:57 crc kubenswrapper[4749]: Trace[2104605450]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (07:12:57.778) Mar 20 07:12:57 crc kubenswrapper[4749]: Trace[2104605450]: [10.00204531s] [10.00204531s] END Mar 20 07:12:57 crc kubenswrapper[4749]: E0320 07:12:57.778538 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 20 07:12:58 crc kubenswrapper[4749]: W0320 07:12:58.075201 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 20 07:12:58 crc kubenswrapper[4749]: I0320 07:12:58.075328 4749 trace.go:236] Trace[424418054]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Mar-2026 07:12:48.073) (total time: 10001ms): Mar 20 07:12:58 crc kubenswrapper[4749]: Trace[424418054]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (07:12:58.075) Mar 20 07:12:58 crc kubenswrapper[4749]: Trace[424418054]: [10.001455246s] [10.001455246s] END Mar 20 07:12:58 crc kubenswrapper[4749]: E0320 07:12:58.075356 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 20 07:12:58 crc kubenswrapper[4749]: I0320 07:12:58.103427 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 20 07:12:58 crc 
kubenswrapper[4749]: W0320 07:12:58.184592 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 20 07:12:58 crc kubenswrapper[4749]: I0320 07:12:58.184724 4749 trace.go:236] Trace[94580551]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Mar-2026 07:12:48.182) (total time: 10001ms): Mar 20 07:12:58 crc kubenswrapper[4749]: Trace[94580551]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (07:12:58.184) Mar 20 07:12:58 crc kubenswrapper[4749]: Trace[94580551]: [10.001778824s] [10.001778824s] END Mar 20 07:12:58 crc kubenswrapper[4749]: E0320 07:12:58.184759 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 20 07:12:58 crc kubenswrapper[4749]: E0320 07:12:58.547385 4749 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": net/http: TLS handshake timeout" event="&Event{ObjectMeta:{crc.189e7b3370f3fe5d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:44.099141213 +0000 UTC m=+0.648798890,LastTimestamp:2026-03-20 07:12:44.099141213 +0000 UTC m=+0.648798890,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:12:58 crc kubenswrapper[4749]: E0320 07:12:58.728066 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:12:58Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 20 07:12:58 crc kubenswrapper[4749]: E0320 07:12:58.730056 4749 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:12:58Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 07:12:58 crc kubenswrapper[4749]: E0320 07:12:58.730979 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:12:58Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 07:12:58 crc kubenswrapper[4749]: W0320 07:12:58.736201 4749 reflector.go:561] 
Mar 20 07:12:58 crc kubenswrapper[4749]: E0320 07:12:58.736266 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:12:58Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 07:12:58 crc kubenswrapper[4749]: I0320 07:12:58.738610 4749 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 20 07:12:58 crc kubenswrapper[4749]: I0320 07:12:58.738685 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Mar 20 07:12:58 crc kubenswrapper[4749]: I0320 07:12:58.743877 4749 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 20 07:12:58 crc kubenswrapper[4749]: I0320 07:12:58.743961 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Mar 20 07:12:58 crc kubenswrapper[4749]: I0320 07:12:58.827854 4749 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Mar 20 07:12:58 crc kubenswrapper[4749]: I0320 07:12:58.827922 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Mar 20 07:12:59 crc kubenswrapper[4749]: I0320 07:12:59.107158 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:12:59Z is after 2026-02-23T05:33:13Z
Mar 20 07:12:59 crc kubenswrapper[4749]: I0320 07:12:59.285272 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 20 07:12:59 crc kubenswrapper[4749]: I0320 07:12:59.288313 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f8efe115d35b0dfdc49fce523a818b421749f312cc5ebbbb81772e727f791ef1" exitCode=255
Mar 20 07:12:59 crc kubenswrapper[4749]: I0320 07:12:59.288317 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f8efe115d35b0dfdc49fce523a818b421749f312cc5ebbbb81772e727f791ef1"}
Mar 20 07:12:59 crc kubenswrapper[4749]: I0320 07:12:59.288553 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 07:12:59 crc kubenswrapper[4749]: I0320 07:12:59.289814 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 07:12:59 crc kubenswrapper[4749]: I0320 07:12:59.289846 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 07:12:59 crc kubenswrapper[4749]: I0320 07:12:59.289856 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 07:12:59 crc kubenswrapper[4749]: I0320 07:12:59.290320 4749 scope.go:117] "RemoveContainer" containerID="f8efe115d35b0dfdc49fce523a818b421749f312cc5ebbbb81772e727f791ef1"
Mar 20 07:13:00 crc kubenswrapper[4749]: I0320 07:13:00.107488 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:13:00Z is after 2026-02-23T05:33:13Z
Mar 20 07:13:00 crc kubenswrapper[4749]: I0320 07:13:00.185612 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 07:13:00 crc kubenswrapper[4749]: I0320 07:13:00.294085 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 20 07:13:00 crc kubenswrapper[4749]: I0320 07:13:00.296197 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c9eecc22366973889992af3b26b7400e6175eba272ddedd64b11d42a16d538af"}
Mar 20 07:13:00 crc kubenswrapper[4749]: I0320 07:13:00.296390 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 07:13:00 crc kubenswrapper[4749]: I0320 07:13:00.297480 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 07:13:00 crc kubenswrapper[4749]: I0320 07:13:00.297523 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 07:13:00 crc kubenswrapper[4749]: I0320 07:13:00.297539 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 07:13:00 crc kubenswrapper[4749]: I0320 07:13:00.303223 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 07:13:00 crc kubenswrapper[4749]: I0320 07:13:00.545811 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 07:13:00 crc kubenswrapper[4749]: I0320 07:13:00.546030 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:13:00 crc kubenswrapper[4749]: I0320 07:13:00.547643 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:13:00 crc kubenswrapper[4749]: I0320 07:13:00.547854 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:13:00 crc kubenswrapper[4749]: I0320 07:13:00.547997 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:13:01 crc kubenswrapper[4749]: I0320 07:13:01.107675 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:13:01Z is after 2026-02-23T05:33:13Z Mar 20 07:13:01 crc kubenswrapper[4749]: I0320 07:13:01.300889 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 07:13:01 crc kubenswrapper[4749]: I0320 07:13:01.301498 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 20 07:13:01 crc kubenswrapper[4749]: I0320 07:13:01.303403 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c9eecc22366973889992af3b26b7400e6175eba272ddedd64b11d42a16d538af" exitCode=255 Mar 20 07:13:01 crc kubenswrapper[4749]: I0320 07:13:01.303459 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"c9eecc22366973889992af3b26b7400e6175eba272ddedd64b11d42a16d538af"} Mar 20 07:13:01 crc kubenswrapper[4749]: I0320 07:13:01.303513 4749 scope.go:117] "RemoveContainer" containerID="f8efe115d35b0dfdc49fce523a818b421749f312cc5ebbbb81772e727f791ef1" Mar 20 07:13:01 crc kubenswrapper[4749]: I0320 07:13:01.303555 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:13:01 crc kubenswrapper[4749]: I0320 07:13:01.305157 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:13:01 crc kubenswrapper[4749]: I0320 07:13:01.305212 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:13:01 crc kubenswrapper[4749]: I0320 07:13:01.305223 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:13:01 crc kubenswrapper[4749]: I0320 07:13:01.305695 4749 scope.go:117] "RemoveContainer" containerID="c9eecc22366973889992af3b26b7400e6175eba272ddedd64b11d42a16d538af" Mar 20 07:13:01 crc kubenswrapper[4749]: E0320 07:13:01.305912 
4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 07:13:01 crc kubenswrapper[4749]: W0320 07:13:01.555062 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:13:01Z is after 2026-02-23T05:33:13Z Mar 20 07:13:01 crc kubenswrapper[4749]: E0320 07:13:01.555169 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:13:01Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 07:13:01 crc kubenswrapper[4749]: I0320 07:13:01.768560 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 07:13:01 crc kubenswrapper[4749]: I0320 07:13:01.814150 4749 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 07:13:01 crc kubenswrapper[4749]: I0320 07:13:01.814255 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 07:13:02 crc kubenswrapper[4749]: I0320 07:13:02.106730 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:13:02Z is after 2026-02-23T05:33:13Z Mar 20 07:13:02 crc kubenswrapper[4749]: W0320 07:13:02.147628 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:13:02Z is after 2026-02-23T05:33:13Z Mar 20 07:13:02 crc kubenswrapper[4749]: E0320 07:13:02.147726 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify 
Mar 20 07:13:02 crc kubenswrapper[4749]: I0320 07:13:02.308267 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 20 07:13:02 crc kubenswrapper[4749]: I0320 07:13:02.310996 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 07:13:02 crc kubenswrapper[4749]: I0320 07:13:02.312264 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 07:13:02 crc kubenswrapper[4749]: I0320 07:13:02.312351 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 07:13:02 crc kubenswrapper[4749]: I0320 07:13:02.312368 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 07:13:02 crc kubenswrapper[4749]: I0320 07:13:02.313097 4749 scope.go:117] "RemoveContainer" containerID="c9eecc22366973889992af3b26b7400e6175eba272ddedd64b11d42a16d538af"
Mar 20 07:13:02 crc kubenswrapper[4749]: E0320 07:13:02.313414 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 07:13:03 crc kubenswrapper[4749]: I0320 07:13:03.107492 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:13:03Z is after 2026-02-23T05:33:13Z
Mar 20 07:13:03 crc kubenswrapper[4749]: I0320 07:13:03.313670 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 07:13:03 crc kubenswrapper[4749]: I0320 07:13:03.314991 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 07:13:03 crc kubenswrapper[4749]: I0320 07:13:03.315057 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 07:13:03 crc kubenswrapper[4749]: I0320 07:13:03.315081 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 07:13:03 crc kubenswrapper[4749]: I0320 07:13:03.315971 4749 scope.go:117] "RemoveContainer" containerID="c9eecc22366973889992af3b26b7400e6175eba272ddedd64b11d42a16d538af"
Mar 20 07:13:03 crc kubenswrapper[4749]: E0320 07:13:03.316275 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 07:13:03 crc kubenswrapper[4749]: W0320 07:13:03.982428 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:13:03Z is after 2026-02-23T05:33:13Z
Mar 20 07:13:03 crc kubenswrapper[4749]: E0320 07:13:03.982525 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:13:03Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 07:13:04 crc kubenswrapper[4749]: I0320 07:13:04.108111 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:13:04Z is after 2026-02-23T05:33:13Z
Mar 20 07:13:04 crc kubenswrapper[4749]: E0320 07:13:04.267400 4749 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 20 07:13:05 crc kubenswrapper[4749]: I0320 07:13:05.108782 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:13:05Z is after 2026-02-23T05:33:13Z
Mar 20 07:13:05 crc kubenswrapper[4749]: I0320 07:13:05.131148 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 07:13:05 crc kubenswrapper[4749]: I0320 07:13:05.132821 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 07:13:05 crc kubenswrapper[4749]: I0320 07:13:05.132885 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 07:13:05 crc kubenswrapper[4749]: I0320 07:13:05.132904 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 07:13:05 crc kubenswrapper[4749]: I0320 07:13:05.132942 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 07:13:05 crc kubenswrapper[4749]: E0320 07:13:05.133624 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:13:05Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 20 07:13:05 crc kubenswrapper[4749]: E0320 07:13:05.137766 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:13:05Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 20 07:13:05 crc kubenswrapper[4749]: W0320 07:13:05.536366 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:13:05Z is after 2026-02-23T05:33:13Z
Mar 20 07:13:05 crc kubenswrapper[4749]: E0320 07:13:05.536432 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:13:05Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 07:13:05 crc kubenswrapper[4749]: I0320 07:13:05.782817 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Mar 20 07:13:05 crc kubenswrapper[4749]: I0320 07:13:05.783001 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 07:13:05 crc kubenswrapper[4749]: I0320 07:13:05.784088 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 07:13:05 crc kubenswrapper[4749]: I0320 07:13:05.784149 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 07:13:05 crc kubenswrapper[4749]: I0320 07:13:05.784159 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 07:13:05 crc kubenswrapper[4749]: I0320 07:13:05.801922 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Mar 20 07:13:06 crc kubenswrapper[4749]: I0320 07:13:06.109403 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 07:13:06 crc kubenswrapper[4749]: I0320 07:13:06.320363 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 07:13:06 crc kubenswrapper[4749]: I0320 07:13:06.324435 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 07:13:06 crc kubenswrapper[4749]: I0320 07:13:06.324488 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 07:13:06 crc kubenswrapper[4749]: I0320 07:13:06.324507 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 07:13:07 crc kubenswrapper[4749]: I0320 07:13:07.037248 4749 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 20 07:13:07 crc kubenswrapper[4749]: I0320 07:13:07.054669 4749 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Mar 20 07:13:07 crc kubenswrapper[4749]: I0320 07:13:07.108549 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 07:13:08 crc kubenswrapper[4749]: I0320 07:13:08.109420 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.554487 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e7b3370f3fe5d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:44.099141213 +0000 UTC m=+0.648798890,LastTimestamp:2026-03-20 07:12:44.099141213 +0000 UTC m=+0.648798890,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.561619 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e7b3374f502f7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:44.166316791 +0000 UTC m=+0.715974458,LastTimestamp:2026-03-20 07:12:44.166316791 +0000 UTC m=+0.715974458,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.566895 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e7b3374f57580 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:44.166346112 +0000 UTC m=+0.716003769,LastTimestamp:2026-03-20 07:12:44.166346112 +0000 UTC m=+0.716003769,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.573071 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e7b3374f5a2da default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:44.166357722 +0000 UTC m=+0.716015379,LastTimestamp:2026-03-20 07:12:44.166357722 +0000 UTC m=+0.716015379,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.579859 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e7b337b714bda default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:44.27512521 +0000 UTC m=+0.824782857,LastTimestamp:2026-03-20 07:12:44.27512521 +0000 UTC m=+0.824782857,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.587084 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e7b3374f502f7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e7b3374f502f7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:44.166316791 +0000 UTC m=+0.715974458,LastTimestamp:2026-03-20 07:12:44.278191304 +0000 UTC m=+0.827848961,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.593712 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e7b3374f57580\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e7b3374f57580 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:44.166346112 +0000 UTC m=+0.716003769,LastTimestamp:2026-03-20 07:12:44.278210224 +0000 UTC m=+0.827867881,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.599852 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e7b3374f5a2da\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e7b3374f5a2da default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:44.166357722 +0000 UTC m=+0.716015379,LastTimestamp:2026-03-20 07:12:44.278220573 +0000 UTC m=+0.827878230,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.606364 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e7b3374f502f7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e7b3374f502f7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:44.166316791 +0000 UTC m=+0.715974458,LastTimestamp:2026-03-20 07:12:44.279378078 +0000 UTC m=+0.829035725,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.612835 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e7b3374f57580\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e7b3374f57580 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:44.166346112 +0000 UTC m=+0.716003769,LastTimestamp:2026-03-20 07:12:44.279395448 +0000 UTC m=+0.829053095,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.620025 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e7b3374f5a2da\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e7b3374f5a2da default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:44.166357722 +0000 UTC m=+0.716015379,LastTimestamp:2026-03-20 07:12:44.279403607 +0000 UTC m=+0.829061254,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.626696 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e7b3374f502f7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e7b3374f502f7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:44.166316791 +0000 UTC m=+0.715974458,LastTimestamp:2026-03-20 07:12:44.279437016 +0000 UTC m=+0.829094673,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.633157 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e7b3374f57580\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e7b3374f57580 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:44.166346112 +0000 UTC m=+0.716003769,LastTimestamp:2026-03-20 07:12:44.279449365 +0000 UTC m=+0.829107022,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.639745 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e7b3374f5a2da\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e7b3374f5a2da default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:44.166357722 +0000 UTC m=+0.716015379,LastTimestamp:2026-03-20 07:12:44.279459675 +0000 UTC m=+0.829117332,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.646642 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e7b3374f502f7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e7b3374f502f7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:44.166316791 +0000 UTC m=+0.715974458,LastTimestamp:2026-03-20 07:12:44.280937149 +0000 UTC m=+0.830594796,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.653349 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e7b3374f57580\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e7b3374f57580 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:44.166346112 +0000 UTC m=+0.716003769,LastTimestamp:2026-03-20 07:12:44.280950509 +0000 UTC m=+0.830608156,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.659865 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e7b3374f5a2da\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e7b3374f5a2da default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:44.166357722 +0000 UTC m=+0.716015379,LastTimestamp:2026-03-20 07:12:44.280958239 +0000 UTC m=+0.830615876,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.666760 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e7b3374f502f7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e7b3374f502f7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:44.166316791 +0000 UTC m=+0.715974458,LastTimestamp:2026-03-20 07:12:44.2819456 +0000 UTC m=+0.831603247,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.670148 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e7b3374f57580\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e7b3374f57580 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:44.166346112 +0000 UTC m=+0.716003769,LastTimestamp:2026-03-20 07:12:44.28195491 +0000 UTC m=+0.831612547,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.673716 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e7b3374f5a2da\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e7b3374f5a2da default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:44.166357722 +0000 UTC m=+0.716015379,LastTimestamp:2026-03-20 07:12:44.28196233 +0000 UTC m=+0.831619967,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
\"default\"" event="&Event{ObjectMeta:{crc.189e7b3374f5a2da default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:44.166357722 +0000 UTC m=+0.716015379,LastTimestamp:2026-03-20 07:12:44.28196233 +0000 UTC m=+0.831619967,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.677077 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e7b3374f502f7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e7b3374f502f7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:44.166316791 +0000 UTC m=+0.715974458,LastTimestamp:2026-03-20 07:12:44.281979239 +0000 UTC m=+0.831636886,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.686128 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e7b3374f57580\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e7b3374f57580 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:44.166346112 +0000 UTC m=+0.716003769,LastTimestamp:2026-03-20 07:12:44.281992829 +0000 UTC m=+0.831650476,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.692370 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e7b3374f502f7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e7b3374f502f7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:44.166316791 +0000 UTC m=+0.715974458,LastTimestamp:2026-03-20 07:12:44.282036907 +0000 UTC m=+0.831694564,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.698243 4749 event.go:359] "Server rejected event (will not retry!)" err="events 
\"crc.189e7b3374f57580\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e7b3374f57580 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:44.166346112 +0000 UTC m=+0.716003769,LastTimestamp:2026-03-20 07:12:44.282106984 +0000 UTC m=+0.831764641,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.704551 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e7b3374f5a2da\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e7b3374f5a2da default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:44.166357722 +0000 UTC m=+0.716015379,LastTimestamp:2026-03-20 07:12:44.282126683 +0000 UTC m=+0.831784330,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.711651 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e7b3392805245 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:44.661985861 +0000 UTC m=+1.211643518,LastTimestamp:2026-03-20 07:12:44.661985861 +0000 UTC m=+1.211643518,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.718092 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e7b3393075a71 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:44.670835313 +0000 UTC m=+1.220492970,LastTimestamp:2026-03-20 07:12:44.670835313 +0000 UTC m=+1.220492970,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.724211 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e7b3393c89be9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:44.683500521 +0000 UTC m=+1.233158178,LastTimestamp:2026-03-20 07:12:44.683500521 +0000 UTC m=+1.233158178,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.730423 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e7b3395ad7a59 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:44.715276889 +0000 UTC m=+1.264934546,LastTimestamp:2026-03-20 07:12:44.715276889 +0000 UTC m=+1.264934546,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.736597 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e7b3396112122 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:44.72180765 +0000 UTC m=+1.271465307,LastTimestamp:2026-03-20 07:12:44.72180765 +0000 UTC m=+1.271465307,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.743686 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e7b33b83510d0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:45.294588112 +0000 UTC m=+1.844245789,LastTimestamp:2026-03-20 07:12:45.294588112 +0000 UTC m=+1.844245789,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.749917 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e7b33b83b0c79 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:45.294980217 +0000 UTC m=+1.844637894,LastTimestamp:2026-03-20 07:12:45.294980217 +0000 UTC m=+1.844637894,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.756871 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e7b33b87ca5e5 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:45.299279333 +0000 UTC m=+1.848937020,LastTimestamp:2026-03-20 07:12:45.299279333 +0000 UTC m=+1.848937020,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.763259 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e7b33b8d0de7b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:45.304798843 +0000 UTC m=+1.854456530,LastTimestamp:2026-03-20 07:12:45.304798843 +0000 UTC m=+1.854456530,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.769755 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e7b33b8e9b9fc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:45.3064279 +0000 UTC m=+1.856085577,LastTimestamp:2026-03-20 07:12:45.3064279 +0000 UTC m=+1.856085577,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.776826 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e7b33b91d4f39 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:45.309808441 +0000 UTC m=+1.859466128,LastTimestamp:2026-03-20 07:12:45.309808441 +0000 UTC m=+1.859466128,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.784275 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e7b33b933e8d2 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:45.311289554 +0000 UTC m=+1.860947231,LastTimestamp:2026-03-20 07:12:45.311289554 +0000 UTC m=+1.860947231,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.791225 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e7b33b93b3351 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:45.311767377 +0000 UTC m=+1.861425064,LastTimestamp:2026-03-20 07:12:45.311767377 +0000 UTC m=+1.861425064,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.798090 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e7b33b9dc8ca7 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:45.322341543 +0000 UTC m=+1.871999210,LastTimestamp:2026-03-20 07:12:45.322341543 +0000 UTC m=+1.871999210,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.805269 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e7b33ba523846 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container 
setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:45.33005319 +0000 UTC m=+1.879710867,LastTimestamp:2026-03-20 07:12:45.33005319 +0000 UTC m=+1.879710867,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.811891 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e7b33ba88e3d4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:45.333636052 +0000 UTC m=+1.883293719,LastTimestamp:2026-03-20 07:12:45.333636052 +0000 UTC m=+1.883293719,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.819581 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e7b33ca17eae9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:45.594667753 +0000 UTC m=+2.144325400,LastTimestamp:2026-03-20 07:12:45.594667753 +0000 UTC m=+2.144325400,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.826261 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e7b33caddc71c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:45.607634716 +0000 UTC m=+2.157292393,LastTimestamp:2026-03-20 07:12:45.607634716 +0000 UTC m=+2.157292393,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:08 crc kubenswrapper[4749]: I0320 07:13:08.826694 4749 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 07:13:08 crc kubenswrapper[4749]: I0320 07:13:08.826947 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:13:08 crc kubenswrapper[4749]: I0320 07:13:08.828363 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:13:08 crc kubenswrapper[4749]: I0320 07:13:08.828565 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:13:08 crc kubenswrapper[4749]: I0320 07:13:08.828750 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:13:08 crc kubenswrapper[4749]: I0320 07:13:08.829765 4749 scope.go:117] "RemoveContainer" containerID="c9eecc22366973889992af3b26b7400e6175eba272ddedd64b11d42a16d538af" Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.830220 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.841835 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e7b33caefc97f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:45.608814975 +0000 UTC m=+2.158472652,LastTimestamp:2026-03-20 07:12:45.608814975 +0000 UTC m=+2.158472652,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.864153 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e7b33d909fe66 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:45.845413478 +0000 UTC 
m=+2.395071155,LastTimestamp:2026-03-20 07:12:45.845413478 +0000 UTC m=+2.395071155,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.870240 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e7b33d9bffe24 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:45.857340964 +0000 UTC m=+2.406998641,LastTimestamp:2026-03-20 07:12:45.857340964 +0000 UTC m=+2.406998641,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.875508 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e7b33d9d307db openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:45.858588635 +0000 UTC m=+2.408246312,LastTimestamp:2026-03-20 07:12:45.858588635 +0000 UTC m=+2.408246312,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.878706 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e7b33e8fbf308 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:46.11292852 +0000 UTC m=+2.662586207,LastTimestamp:2026-03-20 07:12:46.11292852 +0000 UTC 
m=+2.662586207,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.882529 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e7b33e9c16d92 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:46.125870482 +0000 UTC m=+2.675528149,LastTimestamp:2026-03-20 07:12:46.125870482 +0000 UTC m=+2.675528149,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.886967 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e7b33ee2ab733 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:46.199879475 +0000 UTC m=+2.749537132,LastTimestamp:2026-03-20 07:12:46.199879475 +0000 UTC m=+2.749537132,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.890758 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e7b33ee339b4f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:46.200462159 +0000 UTC m=+2.750119836,LastTimestamp:2026-03-20 07:12:46.200462159 +0000 UTC m=+2.750119836,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.895165 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e7b33eecc0885 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:46.210451589 +0000 UTC m=+2.760109246,LastTimestamp:2026-03-20 07:12:46.210451589 +0000 UTC m=+2.760109246,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.899195 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e7b33ef9ef344 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:46.224274244 +0000 UTC m=+2.773931931,LastTimestamp:2026-03-20 07:12:46.224274244 +0000 UTC m=+2.773931931,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.903533 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e7b33ff740607 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:46.489896455 +0000 UTC m=+3.039554102,LastTimestamp:2026-03-20 07:12:46.489896455 +0000 UTC m=+3.039554102,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.908087 4749 event.go:359] "Server rejected event (will not retry!)" err="events 
is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e7b33ff9ad81b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:46.492440603 +0000 UTC m=+3.042098250,LastTimestamp:2026-03-20 07:12:46.492440603 +0000 UTC m=+3.042098250,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.915725 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e7b33fff68175 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:46.498447733 +0000 UTC m=+3.048105380,LastTimestamp:2026-03-20 07:12:46.498447733 +0000 UTC m=+3.048105380,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.923533 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e7b3400096546 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:46.499685702 +0000 UTC m=+3.049343349,LastTimestamp:2026-03-20 07:12:46.499685702 +0000 UTC m=+3.049343349,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.928096 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e7b3400504e84 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:46.504332932 +0000 UTC m=+3.053990619,LastTimestamp:2026-03-20 07:12:46.504332932 +0000 UTC m=+3.053990619,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.933168 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e7b34005a175b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:46.504974171 +0000 UTC m=+3.054631818,LastTimestamp:2026-03-20 07:12:46.504974171 +0000 UTC m=+3.054631818,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.938174 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e7b34005dc00c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:46.505213964 +0000 UTC m=+3.054871651,LastTimestamp:2026-03-20 07:12:46.505213964 +0000 UTC m=+3.054871651,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.942837 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e7b340069c323 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:46.506001187 +0000 UTC m=+3.055658834,LastTimestamp:2026-03-20 07:12:46.506001187 +0000 UTC m=+3.055658834,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.948862 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e7b34019e0625 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:46.526203429 +0000 UTC m=+3.075861076,LastTimestamp:2026-03-20 07:12:46.526203429 +0000 UTC m=+3.075861076,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.954418 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e7b34023c41d3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:46.536573395 +0000 UTC m=+3.086231042,LastTimestamp:2026-03-20 07:12:46.536573395 +0000 UTC m=+3.086231042,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.960351 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e7b340c74494a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:46.708017482 +0000 UTC m=+3.257675129,LastTimestamp:2026-03-20 07:12:46.708017482 +0000 UTC m=+3.257675129,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.965917 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e7b340c8cd284 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:46.709625476 +0000 UTC m=+3.259283143,LastTimestamp:2026-03-20 07:12:46.709625476 +0000 UTC m=+3.259283143,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.976163 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e7b340d14eda8 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:46.71854532 +0000 UTC m=+3.268202977,LastTimestamp:2026-03-20 07:12:46.71854532 +0000 UTC m=+3.268202977,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.980409 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e7b340d2cc6bf openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:46.720108223 +0000 UTC m=+3.269765880,LastTimestamp:2026-03-20 07:12:46.720108223 +0000 UTC m=+3.269765880,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.984992 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot 
create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e7b340dee5f73 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:46.732795763 +0000 UTC m=+3.282453410,LastTimestamp:2026-03-20 07:12:46.732795763 +0000 UTC m=+3.282453410,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.990077 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e7b340e177719 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:46.735488793 +0000 UTC m=+3.285146450,LastTimestamp:2026-03-20 07:12:46.735488793 +0000 UTC m=+3.285146450,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:08 crc kubenswrapper[4749]: E0320 07:13:08.995182 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e7b3419153c92 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:46.919892114 +0000 UTC m=+3.469549761,LastTimestamp:2026-03-20 07:12:46.919892114 +0000 UTC m=+3.469549761,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:09 crc kubenswrapper[4749]: E0320 07:13:09.002856 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e7b34195dbe42 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:46.924643906 +0000 UTC m=+3.474301553,LastTimestamp:2026-03-20 07:12:46.924643906 +0000 UTC m=+3.474301553,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:09 crc kubenswrapper[4749]: E0320 07:13:09.008353 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e7b3419d5d7c1 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:46.932514753 +0000 UTC m=+3.482172400,LastTimestamp:2026-03-20 07:12:46.932514753 +0000 UTC m=+3.482172400,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:09 crc kubenswrapper[4749]: E0320 07:13:09.013877 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e7b341a7f8714 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:46.94363522 +0000 UTC m=+3.493292867,LastTimestamp:2026-03-20 07:12:46.94363522 +0000 UTC m=+3.493292867,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:09 crc kubenswrapper[4749]: E0320 07:13:09.020519 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e7b341a9d0e87 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:46.945570439 +0000 UTC m=+3.495228086,LastTimestamp:2026-03-20 07:12:46.945570439 +0000 UTC m=+3.495228086,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:09 crc kubenswrapper[4749]: E0320 07:13:09.024753 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e7b342676a635 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:47.144379957 +0000 UTC m=+3.694037604,LastTimestamp:2026-03-20 07:12:47.144379957 +0000 UTC m=+3.694037604,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:09 crc kubenswrapper[4749]: E0320 07:13:09.028689 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e7b34272b7cd7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:47.156231383 +0000 UTC m=+3.705889030,LastTimestamp:2026-03-20 07:12:47.156231383 +0000 UTC m=+3.705889030,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:09 crc kubenswrapper[4749]: E0320 07:13:09.033260 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e7b3427380d1a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:47.157054746 +0000 UTC 
m=+3.706712393,LastTimestamp:2026-03-20 07:12:47.157054746 +0000 UTC m=+3.706712393,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:09 crc kubenswrapper[4749]: E0320 07:13:09.038460 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e7b342ba6a2b6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:47.23141087 +0000 UTC m=+3.781068517,LastTimestamp:2026-03-20 07:12:47.23141087 +0000 UTC m=+3.781068517,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:09 crc kubenswrapper[4749]: E0320 07:13:09.043025 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e7b3434a2eec0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:47.382163136 +0000 UTC m=+3.931820833,LastTimestamp:2026-03-20 07:12:47.382163136 +0000 UTC m=+3.931820833,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:09 crc kubenswrapper[4749]: E0320 07:13:09.047761 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e7b34358dc4d7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:47.397553367 +0000 UTC m=+3.947211054,LastTimestamp:2026-03-20 07:12:47.397553367 +0000 UTC m=+3.947211054,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:09 crc kubenswrapper[4749]: E0320 07:13:09.052218 4749 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e7b3438d7e4d0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:47.452742864 +0000 UTC m=+4.002400521,LastTimestamp:2026-03-20 07:12:47.452742864 +0000 UTC m=+4.002400521,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:09 crc kubenswrapper[4749]: E0320 07:13:09.056457 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e7b3439ac338c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:47.466656652 +0000 UTC m=+4.016314329,LastTimestamp:2026-03-20 07:12:47.466656652 +0000 UTC m=+4.016314329,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:09 crc kubenswrapper[4749]: E0320 07:13:09.061833 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e7b3467e5b10a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:48.242176266 +0000 UTC m=+4.791833953,LastTimestamp:2026-03-20 07:12:48.242176266 +0000 UTC m=+4.791833953,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:09 crc kubenswrapper[4749]: E0320 07:13:09.065908 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e7b3476f4a8a7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container 
etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:48.494815399 +0000 UTC m=+5.044473086,LastTimestamp:2026-03-20 07:12:48.494815399 +0000 UTC m=+5.044473086,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:09 crc kubenswrapper[4749]: E0320 07:13:09.070083 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e7b3477bb1a8b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:48.507820683 +0000 UTC m=+5.057478360,LastTimestamp:2026-03-20 07:12:48.507820683 +0000 UTC m=+5.057478360,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:09 crc kubenswrapper[4749]: E0320 07:13:09.074859 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e7b3477cf9c63 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:48.509164643 +0000 UTC m=+5.058822320,LastTimestamp:2026-03-20 07:12:48.509164643 +0000 UTC m=+5.058822320,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:09 crc kubenswrapper[4749]: E0320 07:13:09.081052 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e7b348616903a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:48.74869561 +0000 UTC m=+5.298353297,LastTimestamp:2026-03-20 07:12:48.74869561 +0000 UTC m=+5.298353297,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:09 crc kubenswrapper[4749]: E0320 07:13:09.085366 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e7b34872b0d68 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:48.766815592 +0000 UTC m=+5.316473269,LastTimestamp:2026-03-20 07:12:48.766815592 +0000 UTC m=+5.316473269,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:09 crc kubenswrapper[4749]: E0320 07:13:09.091315 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e7b34873e75d7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:48.768087511 +0000 UTC m=+5.317745198,LastTimestamp:2026-03-20 07:12:48.768087511 +0000 UTC m=+5.317745198,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:09 crc kubenswrapper[4749]: E0320 07:13:09.096956 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e7b3493642ceb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:48.971885803 +0000 UTC m=+5.521543450,LastTimestamp:2026-03-20 07:12:48.971885803 +0000 UTC m=+5.521543450,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:09 crc kubenswrapper[4749]: E0320 07:13:09.101672 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e7b34946d18ff openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:48.989247743 +0000 UTC m=+5.538905390,LastTimestamp:2026-03-20 07:12:48.989247743 
+0000 UTC m=+5.538905390,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:09 crc kubenswrapper[4749]: I0320 07:13:09.107302 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 07:13:09 crc kubenswrapper[4749]: E0320 07:13:09.107616 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e7b3494785a9a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:48.989985434 +0000 UTC m=+5.539643081,LastTimestamp:2026-03-20 07:12:48.989985434 +0000 UTC m=+5.539643081,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:09 crc kubenswrapper[4749]: E0320 07:13:09.114417 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e7b34a44fce8b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:49.255763595 +0000 UTC m=+5.805421282,LastTimestamp:2026-03-20 07:12:49.255763595 +0000 UTC m=+5.805421282,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:09 crc kubenswrapper[4749]: E0320 07:13:09.120360 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e7b34a4f3d959 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:49.266514265 +0000 UTC m=+5.816171942,LastTimestamp:2026-03-20 07:12:49.266514265 +0000 UTC m=+5.816171942,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:09 crc kubenswrapper[4749]: E0320 07:13:09.124671 4749 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e7b34a50814d7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:49.267840215 +0000 UTC m=+5.817497892,LastTimestamp:2026-03-20 07:12:49.267840215 +0000 UTC m=+5.817497892,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:09 crc kubenswrapper[4749]: E0320 07:13:09.130628 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e7b34b49314bb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:49.528607931 +0000 UTC m=+6.078265578,LastTimestamp:2026-03-20 07:12:49.528607931 +0000 UTC m=+6.078265578,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:09 crc kubenswrapper[4749]: E0320 07:13:09.136530 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e7b34b59e5498 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:49.546122392 +0000 UTC m=+6.095780039,LastTimestamp:2026-03-20 07:12:49.546122392 +0000 UTC m=+6.095780039,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:09 crc kubenswrapper[4749]: E0320 07:13:09.144552 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 07:13:09 crc kubenswrapper[4749]: &Event{ObjectMeta:{kube-controller-manager-crc.189e7b353cbf7c48 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 20 07:13:09 crc kubenswrapper[4749]: body: Mar 20 07:13:09 crc kubenswrapper[4749]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:51.8132194 +0000 UTC m=+8.362877077,LastTimestamp:2026-03-20 07:12:51.8132194 +0000 UTC m=+8.362877077,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 07:13:09 crc kubenswrapper[4749]: > Mar 20 07:13:09 crc kubenswrapper[4749]: E0320 07:13:09.152378 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e7b353cc160a1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:51.813343393 +0000 UTC m=+8.363001080,LastTimestamp:2026-03-20 07:12:51.813343393 +0000 UTC m=+8.363001080,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:09 crc kubenswrapper[4749]: E0320 07:13:09.158861 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 07:13:09 crc kubenswrapper[4749]: &Event{ObjectMeta:{kube-apiserver-crc.189e7b36d9895f9a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 20 07:13:09 crc kubenswrapper[4749]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 07:13:09 crc kubenswrapper[4749]: Mar 20 07:13:09 crc kubenswrapper[4749]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:58.738663322 +0000 UTC m=+15.288320979,LastTimestamp:2026-03-20 07:12:58.738663322 +0000 UTC m=+15.288320979,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 07:13:09 crc kubenswrapper[4749]: > Mar 20 07:13:09 crc kubenswrapper[4749]: E0320 
07:13:09.168268 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e7b36d98a299a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:58.738715034 +0000 UTC m=+15.288372691,LastTimestamp:2026-03-20 07:12:58.738715034 +0000 UTC m=+15.288372691,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:09 crc kubenswrapper[4749]: E0320 07:13:09.174474 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e7b36d9895f9a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 07:13:09 crc kubenswrapper[4749]: &Event{ObjectMeta:{kube-apiserver-crc.189e7b36d9895f9a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 20 07:13:09 crc kubenswrapper[4749]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 07:13:09 crc kubenswrapper[4749]: Mar 20 07:13:09 crc kubenswrapper[4749]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:58.738663322 +0000 UTC m=+15.288320979,LastTimestamp:2026-03-20 07:12:58.743938577 +0000 UTC m=+15.293596244,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 07:13:09 crc kubenswrapper[4749]: > Mar 20 07:13:09 crc kubenswrapper[4749]: E0320 07:13:09.180819 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e7b36d98a299a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e7b36d98a299a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:58.738715034 +0000 UTC m=+15.288372691,LastTimestamp:2026-03-20 07:12:58.743991819 +0000 UTC m=+15.293649476,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:09 crc kubenswrapper[4749]: E0320 07:13:09.187117 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 07:13:09 crc kubenswrapper[4749]: &Event{ObjectMeta:{kube-apiserver-crc.189e7b36dedafe4e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Liveness probe error: Get "https://192.168.126.11:17697/healthz": dial tcp 192.168.126.11:17697: connect: connection refused Mar 20 07:13:09 crc kubenswrapper[4749]: body: Mar 20 07:13:09 crc kubenswrapper[4749]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:58.827898446 +0000 UTC m=+15.377556093,LastTimestamp:2026-03-20 07:12:58.827898446 +0000 UTC m=+15.377556093,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 07:13:09 crc kubenswrapper[4749]: > Mar 20 07:13:09 crc kubenswrapper[4749]: E0320 07:13:09.192977 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e7b36dedbbaab openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Liveness probe failed: Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:58.827946667 +0000 UTC m=+15.377604314,LastTimestamp:2026-03-20 07:12:58.827946667 +0000 UTC m=+15.377604314,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:09 crc kubenswrapper[4749]: E0320 07:13:09.199663 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e7b3427380d1a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e7b3427380d1a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:47.157054746 +0000 UTC m=+3.706712393,LastTimestamp:2026-03-20 
07:12:59.291414934 +0000 UTC m=+15.841072591,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:09 crc kubenswrapper[4749]: E0320 07:13:09.207259 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 07:13:09 crc kubenswrapper[4749]: &Event{ObjectMeta:{kube-controller-manager-crc.189e7b3790daaccc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 07:13:09 crc kubenswrapper[4749]: body: Mar 20 07:13:09 crc kubenswrapper[4749]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:13:01.814222028 +0000 UTC m=+18.363879705,LastTimestamp:2026-03-20 07:13:01.814222028 +0000 UTC m=+18.363879705,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 07:13:09 crc kubenswrapper[4749]: > Mar 20 07:13:09 crc kubenswrapper[4749]: E0320 07:13:09.213166 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e7b3790dc3f30 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:13:01.81432504 +0000 UTC m=+18.363982717,LastTimestamp:2026-03-20 07:13:01.81432504 +0000 UTC m=+18.363982717,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:10 crc kubenswrapper[4749]: I0320 07:13:10.109542 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 07:13:11 crc kubenswrapper[4749]: I0320 07:13:11.110024 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 07:13:11 crc kubenswrapper[4749]: W0320 07:13:11.277190 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes 
"crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 20 07:13:11 crc kubenswrapper[4749]: E0320 07:13:11.277331 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 20 07:13:11 crc kubenswrapper[4749]: W0320 07:13:11.612850 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 20 07:13:11 crc kubenswrapper[4749]: E0320 07:13:11.612938 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 20 07:13:11 crc kubenswrapper[4749]: I0320 07:13:11.814571 4749 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 07:13:11 crc kubenswrapper[4749]: I0320 07:13:11.814706 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 07:13:11 crc kubenswrapper[4749]: I0320 07:13:11.814803 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 07:13:11 crc kubenswrapper[4749]: I0320 07:13:11.815090 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:13:11 crc kubenswrapper[4749]: I0320 07:13:11.816848 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:13:11 crc kubenswrapper[4749]: I0320 07:13:11.816908 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:13:11 crc kubenswrapper[4749]: I0320 07:13:11.816932 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:13:11 crc kubenswrapper[4749]: I0320 07:13:11.817699 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"2bf150976cb265b706cccb6e625a9a0a06d47f2dbd69d032957551966e691e43"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 20 07:13:11 crc kubenswrapper[4749]: I0320 07:13:11.817958 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" 
containerName="cluster-policy-controller" containerID="cri-o://2bf150976cb265b706cccb6e625a9a0a06d47f2dbd69d032957551966e691e43" gracePeriod=30 Mar 20 07:13:11 crc kubenswrapper[4749]: E0320 07:13:11.822788 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e7b3790daaccc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 07:13:11 crc kubenswrapper[4749]: &Event{ObjectMeta:{kube-controller-manager-crc.189e7b3790daaccc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 07:13:11 crc kubenswrapper[4749]: body: Mar 20 07:13:11 crc kubenswrapper[4749]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:13:01.814222028 +0000 UTC m=+18.363879705,LastTimestamp:2026-03-20 07:13:11.814660915 +0000 UTC m=+28.364318602,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 07:13:11 crc kubenswrapper[4749]: > Mar 20 07:13:11 crc kubenswrapper[4749]: E0320 07:13:11.829577 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e7b3790dc3f30\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e7b3790dc3f30 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:13:01.81432504 +0000 UTC m=+18.363982717,LastTimestamp:2026-03-20 07:13:11.814754737 +0000 UTC m=+28.364412424,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:11 crc kubenswrapper[4749]: E0320 07:13:11.836147 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e7b39e51f226a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:13:11.817929322 +0000 UTC m=+28.367586999,LastTimestamp:2026-03-20 07:13:11.817929322 +0000 UTC m=+28.367586999,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:11 crc kubenswrapper[4749]: E0320 07:13:11.944908 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e7b33b93b3351\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e7b33b93b3351 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:45.311767377 +0000 UTC m=+1.861425064,LastTimestamp:2026-03-20 07:13:11.938966288 +0000 UTC m=+28.488623965,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:12 crc kubenswrapper[4749]: I0320 07:13:12.109331 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 07:13:12 crc kubenswrapper[4749]: I0320 07:13:12.138002 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:13:12 crc kubenswrapper[4749]: I0320 07:13:12.139419 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:13:12 crc kubenswrapper[4749]: I0320 07:13:12.139456 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:13:12 crc kubenswrapper[4749]: I0320 07:13:12.139468 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:13:12 crc kubenswrapper[4749]: I0320 07:13:12.139493 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 07:13:12 crc kubenswrapper[4749]: E0320 07:13:12.141883 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 07:13:12 crc kubenswrapper[4749]: E0320 07:13:12.142384 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource 
\"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 07:13:12 crc kubenswrapper[4749]: E0320 07:13:12.164361 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e7b33ca17eae9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e7b33ca17eae9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:45.594667753 +0000 UTC m=+2.144325400,LastTimestamp:2026-03-20 07:13:12.162937063 +0000 UTC m=+28.712594720,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:12 crc kubenswrapper[4749]: E0320 07:13:12.182686 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e7b33caddc71c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e7b33caddc71c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:12:45.607634716 +0000 UTC m=+2.157292393,LastTimestamp:2026-03-20 07:13:12.175088671 +0000 UTC m=+28.724746328,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:12 crc kubenswrapper[4749]: W0320 07:13:12.242250 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 20 07:13:12 crc kubenswrapper[4749]: E0320 07:13:12.242360 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 07:13:12 crc kubenswrapper[4749]: I0320 07:13:12.341844 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 07:13:12 crc kubenswrapper[4749]: I0320 07:13:12.342397 4749 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" 
containerID="2bf150976cb265b706cccb6e625a9a0a06d47f2dbd69d032957551966e691e43" exitCode=255 Mar 20 07:13:12 crc kubenswrapper[4749]: I0320 07:13:12.342481 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"2bf150976cb265b706cccb6e625a9a0a06d47f2dbd69d032957551966e691e43"} Mar 20 07:13:12 crc kubenswrapper[4749]: I0320 07:13:12.342527 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a65618f22b785dac8d848207a38f7fb6d287be2ab31a4eac9731a364dd487702"} Mar 20 07:13:12 crc kubenswrapper[4749]: I0320 07:13:12.342658 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:13:12 crc kubenswrapper[4749]: I0320 07:13:12.343777 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:13:12 crc kubenswrapper[4749]: I0320 07:13:12.343815 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:13:12 crc kubenswrapper[4749]: I0320 07:13:12.343827 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:13:13 crc kubenswrapper[4749]: I0320 07:13:13.108739 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 07:13:13 crc kubenswrapper[4749]: I0320 07:13:13.200720 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 07:13:13 crc kubenswrapper[4749]: I0320 07:13:13.345436 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:13:13 crc kubenswrapper[4749]: I0320 07:13:13.346718 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:13:13 crc kubenswrapper[4749]: I0320 07:13:13.346780 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:13:13 crc kubenswrapper[4749]: I0320 07:13:13.346804 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:13:14 crc kubenswrapper[4749]: I0320 07:13:14.110067 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 07:13:14 crc kubenswrapper[4749]: E0320 07:13:14.267522 4749 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 07:13:15 crc kubenswrapper[4749]: I0320 07:13:15.109735 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 07:13:16 crc kubenswrapper[4749]: I0320 07:13:16.108713 4749 csi_plugin.go:884] Failed to contact API server when waiting 
for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 07:13:17 crc kubenswrapper[4749]: I0320 07:13:17.109069 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 07:13:18 crc kubenswrapper[4749]: I0320 07:13:18.107761 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 07:13:18 crc kubenswrapper[4749]: I0320 07:13:18.813497 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 07:13:18 crc kubenswrapper[4749]: I0320 07:13:18.813733 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:13:18 crc kubenswrapper[4749]: I0320 07:13:18.815208 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:13:18 crc kubenswrapper[4749]: I0320 07:13:18.815277 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:13:18 crc kubenswrapper[4749]: I0320 07:13:18.815332 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:13:19 crc kubenswrapper[4749]: I0320 07:13:19.108647 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 07:13:19 crc kubenswrapper[4749]: I0320 07:13:19.143068 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:13:19 crc kubenswrapper[4749]: I0320 07:13:19.144870 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:13:19 crc kubenswrapper[4749]: I0320 07:13:19.144925 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:13:19 crc kubenswrapper[4749]: I0320 07:13:19.144943 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:13:19 crc kubenswrapper[4749]: I0320 07:13:19.145005 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 07:13:19 crc kubenswrapper[4749]: E0320 07:13:19.148557 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 07:13:19 crc kubenswrapper[4749]: E0320 07:13:19.148912 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 07:13:20 crc kubenswrapper[4749]: I0320 07:13:20.109530 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: 
csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 07:13:21 crc kubenswrapper[4749]: I0320 07:13:21.110264 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 07:13:21 crc kubenswrapper[4749]: I0320 07:13:21.814549 4749 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 07:13:21 crc kubenswrapper[4749]: I0320 07:13:21.814654 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 07:13:21 crc kubenswrapper[4749]: E0320 07:13:21.821999 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e7b3790daaccc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 07:13:21 crc kubenswrapper[4749]: &Event{ObjectMeta:{kube-controller-manager-crc.189e7b3790daaccc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 07:13:21 crc kubenswrapper[4749]: body: Mar 20 07:13:21 crc kubenswrapper[4749]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:13:01.814222028 +0000 UTC m=+18.363879705,LastTimestamp:2026-03-20 07:13:21.814624395 +0000 UTC m=+38.364282082,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 07:13:21 crc kubenswrapper[4749]: > Mar 20 07:13:21 crc kubenswrapper[4749]: E0320 07:13:21.828847 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e7b3790dc3f30\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e7b3790dc3f30 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get 
\"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:13:01.81432504 +0000 UTC m=+18.363982717,LastTimestamp:2026-03-20 07:13:21.814707547 +0000 UTC m=+38.364365224,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:13:22 crc kubenswrapper[4749]: I0320 07:13:22.110755 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 07:13:22 crc kubenswrapper[4749]: I0320 07:13:22.177466 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:13:22 crc kubenswrapper[4749]: I0320 07:13:22.179351 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:13:22 crc kubenswrapper[4749]: I0320 07:13:22.179456 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:13:22 crc kubenswrapper[4749]: I0320 07:13:22.179493 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:13:22 crc kubenswrapper[4749]: I0320 07:13:22.180594 4749 scope.go:117] "RemoveContainer" containerID="c9eecc22366973889992af3b26b7400e6175eba272ddedd64b11d42a16d538af" Mar 20 07:13:23 crc kubenswrapper[4749]: I0320 07:13:23.109019 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 07:13:23 crc kubenswrapper[4749]: I0320 07:13:23.376154 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 07:13:23 crc kubenswrapper[4749]: I0320 07:13:23.380003 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a8d83b7d66918c0199a4fa1e65d2480ad645376f1dc2323a37f13c7f2e9db786"} Mar 20 07:13:23 crc kubenswrapper[4749]: I0320 07:13:23.380414 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:13:23 crc kubenswrapper[4749]: I0320 07:13:23.381958 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:13:23 crc kubenswrapper[4749]: I0320 07:13:23.382054 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:13:23 crc kubenswrapper[4749]: I0320 07:13:23.382077 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:13:24 crc kubenswrapper[4749]: I0320 07:13:24.112854 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 07:13:24 crc kubenswrapper[4749]: E0320 07:13:24.268605 
4749 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 07:13:24 crc kubenswrapper[4749]: I0320 07:13:24.385586 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 07:13:24 crc kubenswrapper[4749]: I0320 07:13:24.386192 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 07:13:24 crc kubenswrapper[4749]: I0320 07:13:24.388698 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a8d83b7d66918c0199a4fa1e65d2480ad645376f1dc2323a37f13c7f2e9db786" exitCode=255 Mar 20 07:13:24 crc kubenswrapper[4749]: I0320 07:13:24.388757 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a8d83b7d66918c0199a4fa1e65d2480ad645376f1dc2323a37f13c7f2e9db786"} Mar 20 07:13:24 crc kubenswrapper[4749]: I0320 07:13:24.388810 4749 scope.go:117] "RemoveContainer" containerID="c9eecc22366973889992af3b26b7400e6175eba272ddedd64b11d42a16d538af" Mar 20 07:13:24 crc kubenswrapper[4749]: I0320 07:13:24.389055 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:13:24 crc kubenswrapper[4749]: I0320 07:13:24.390330 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:13:24 crc kubenswrapper[4749]: I0320 07:13:24.390401 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:13:24 crc kubenswrapper[4749]: I0320 07:13:24.390421 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:13:24 crc kubenswrapper[4749]: I0320 07:13:24.391263 4749 scope.go:117] "RemoveContainer" containerID="a8d83b7d66918c0199a4fa1e65d2480ad645376f1dc2323a37f13c7f2e9db786" Mar 20 07:13:24 crc kubenswrapper[4749]: E0320 07:13:24.391584 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 07:13:24 crc kubenswrapper[4749]: W0320 07:13:24.716930 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 20 07:13:24 crc kubenswrapper[4749]: E0320 07:13:24.717021 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 20 07:13:25 crc kubenswrapper[4749]: I0320 07:13:25.109011 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource 
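
Note the back-off here is 20s and becomes 40s by 07:13:47: the kubelet's container restart back-off doubles on every crash up to a cap. A small sketch of that policy; the 10s base and 300s cap are the stock kubelet constants and should be treated as assumptions here.

package main

import (
	"fmt"
	"time"
)

// Kubelet-style container restart back-off: start at a base period and
// double on every crash, capped at a maximum (assumed: 10s base, 300s cap).
func restartBackoff(restarts int, base, max time.Duration) time.Duration {
	d := base
	for i := 0; i < restarts; i++ {
		d *= 2
		if d > max {
			return max
		}
	}
	return d
}

func main() {
	for r := 0; r <= 6; r++ {
		fmt.Printf("restart %d -> back-off %s\n", r, restartBackoff(r, 10*time.Second, 300*time.Second))
	}
	// restart 1 -> 20s and restart 2 -> 40s match the CrashLoopBackOff
	// messages for kube-apiserver-check-endpoints in this log.
}
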
"csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 07:13:25 crc kubenswrapper[4749]: W0320 07:13:25.137903 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 20 07:13:25 crc kubenswrapper[4749]: E0320 07:13:25.137977 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 20 07:13:25 crc kubenswrapper[4749]: I0320 07:13:25.393408 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 07:13:26 crc kubenswrapper[4749]: I0320 07:13:26.110781 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 07:13:26 crc kubenswrapper[4749]: I0320 07:13:26.149022 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:13:26 crc kubenswrapper[4749]: I0320 07:13:26.156429 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:13:26 crc kubenswrapper[4749]: I0320 07:13:26.156495 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:13:26 crc kubenswrapper[4749]: I0320 07:13:26.156512 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:13:26 crc kubenswrapper[4749]: I0320 07:13:26.156550 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 07:13:26 crc kubenswrapper[4749]: E0320 07:13:26.160864 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 07:13:26 crc kubenswrapper[4749]: E0320 07:13:26.160963 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 07:13:27 crc kubenswrapper[4749]: I0320 07:13:27.107911 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 07:13:27 crc kubenswrapper[4749]: W0320 07:13:27.175080 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 20 07:13:27 crc kubenswrapper[4749]: E0320 07:13:27.175144 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot 
list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 07:13:28 crc kubenswrapper[4749]: I0320 07:13:28.109864 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 07:13:28 crc kubenswrapper[4749]: I0320 07:13:28.826943 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 07:13:28 crc kubenswrapper[4749]: I0320 07:13:28.827191 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:13:28 crc kubenswrapper[4749]: I0320 07:13:28.828855 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:13:28 crc kubenswrapper[4749]: I0320 07:13:28.828919 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:13:28 crc kubenswrapper[4749]: I0320 07:13:28.828946 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:13:28 crc kubenswrapper[4749]: I0320 07:13:28.829860 4749 scope.go:117] "RemoveContainer" containerID="a8d83b7d66918c0199a4fa1e65d2480ad645376f1dc2323a37f13c7f2e9db786" Mar 20 07:13:28 crc kubenswrapper[4749]: E0320 07:13:28.830237 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 07:13:29 crc kubenswrapper[4749]: I0320 07:13:29.109411 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 07:13:29 crc kubenswrapper[4749]: W0320 07:13:29.527618 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 20 07:13:29 crc kubenswrapper[4749]: E0320 07:13:29.527702 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 07:13:30 crc kubenswrapper[4749]: I0320 07:13:30.109831 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 07:13:31 crc kubenswrapper[4749]: I0320 07:13:31.109638 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 
20 07:13:31 crc kubenswrapper[4749]: I0320 07:13:31.768000 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 07:13:31 crc kubenswrapper[4749]: I0320 07:13:31.768277 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:13:31 crc kubenswrapper[4749]: I0320 07:13:31.770442 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:13:31 crc kubenswrapper[4749]: I0320 07:13:31.770526 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:13:31 crc kubenswrapper[4749]: I0320 07:13:31.770550 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:13:31 crc kubenswrapper[4749]: I0320 07:13:31.771692 4749 scope.go:117] "RemoveContainer" containerID="a8d83b7d66918c0199a4fa1e65d2480ad645376f1dc2323a37f13c7f2e9db786" Mar 20 07:13:31 crc kubenswrapper[4749]: E0320 07:13:31.772081 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 07:13:31 crc kubenswrapper[4749]: I0320 07:13:31.813596 4749 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 07:13:31 crc kubenswrapper[4749]: I0320 07:13:31.813683 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 07:13:31 crc kubenswrapper[4749]: E0320 07:13:31.822136 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e7b3790daaccc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 07:13:31 crc kubenswrapper[4749]: &Event{ObjectMeta:{kube-controller-manager-crc.189e7b3790daaccc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 07:13:31 crc kubenswrapper[4749]: body: Mar 20 07:13:31 crc kubenswrapper[4749]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:13:01.814222028 +0000 UTC 
m=+18.363879705,LastTimestamp:2026-03-20 07:13:31.813660555 +0000 UTC m=+48.363318242,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 07:13:31 crc kubenswrapper[4749]: > Mar 20 07:13:32 crc kubenswrapper[4749]: I0320 07:13:32.110402 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 07:13:33 crc kubenswrapper[4749]: I0320 07:13:33.109373 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 07:13:33 crc kubenswrapper[4749]: I0320 07:13:33.161096 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:13:33 crc kubenswrapper[4749]: I0320 07:13:33.162957 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:13:33 crc kubenswrapper[4749]: I0320 07:13:33.163050 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:13:33 crc kubenswrapper[4749]: I0320 07:13:33.163078 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:13:33 crc kubenswrapper[4749]: I0320 07:13:33.163129 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 07:13:33 crc kubenswrapper[4749]: E0320 07:13:33.168453 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 07:13:33 crc kubenswrapper[4749]: E0320 07:13:33.168921 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 07:13:34 crc kubenswrapper[4749]: I0320 07:13:34.109691 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 07:13:34 crc kubenswrapper[4749]: E0320 07:13:34.268979 4749 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 07:13:35 crc kubenswrapper[4749]: I0320 07:13:35.109990 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 07:13:36 crc kubenswrapper[4749]: I0320 07:13:36.087785 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 07:13:36 crc kubenswrapper[4749]: I0320 07:13:36.088021 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:13:36 crc kubenswrapper[4749]: I0320 07:13:36.089509 4749 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:13:36 crc kubenswrapper[4749]: I0320 07:13:36.089560 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:13:36 crc kubenswrapper[4749]: I0320 07:13:36.089579 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:13:36 crc kubenswrapper[4749]: I0320 07:13:36.109400 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 07:13:37 crc kubenswrapper[4749]: I0320 07:13:37.109637 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 07:13:38 crc kubenswrapper[4749]: I0320 07:13:38.108588 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 07:13:38 crc kubenswrapper[4749]: I0320 07:13:38.819920 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 07:13:38 crc kubenswrapper[4749]: I0320 07:13:38.820129 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:13:38 crc kubenswrapper[4749]: I0320 07:13:38.821959 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:13:38 crc kubenswrapper[4749]: I0320 07:13:38.822018 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:13:38 crc kubenswrapper[4749]: I0320 07:13:38.822039 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:13:38 crc kubenswrapper[4749]: I0320 07:13:38.826168 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 07:13:39 crc kubenswrapper[4749]: I0320 07:13:39.110146 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 07:13:39 crc kubenswrapper[4749]: I0320 07:13:39.435028 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:13:39 crc kubenswrapper[4749]: I0320 07:13:39.436139 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:13:39 crc kubenswrapper[4749]: I0320 07:13:39.436204 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:13:39 crc kubenswrapper[4749]: I0320 07:13:39.436222 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:13:40 crc kubenswrapper[4749]: I0320 07:13:40.109218 4749 csi_plugin.go:884] Failed to contact API server when 
waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 07:13:40 crc kubenswrapper[4749]: I0320 07:13:40.169616 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:13:40 crc kubenswrapper[4749]: I0320 07:13:40.171143 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:13:40 crc kubenswrapper[4749]: I0320 07:13:40.171181 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:13:40 crc kubenswrapper[4749]: I0320 07:13:40.171195 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:13:40 crc kubenswrapper[4749]: I0320 07:13:40.171218 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 07:13:40 crc kubenswrapper[4749]: E0320 07:13:40.176675 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 07:13:40 crc kubenswrapper[4749]: E0320 07:13:40.176871 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 07:13:41 crc kubenswrapper[4749]: I0320 07:13:41.108045 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 07:13:42 crc kubenswrapper[4749]: I0320 07:13:42.106201 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 07:13:43 crc kubenswrapper[4749]: I0320 07:13:43.109792 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 07:13:44 crc kubenswrapper[4749]: I0320 07:13:44.107379 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 07:13:44 crc kubenswrapper[4749]: E0320 07:13:44.269442 4749 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 07:13:45 crc kubenswrapper[4749]: I0320 07:13:45.107157 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 07:13:46 crc kubenswrapper[4749]: I0320 07:13:46.110096 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource 
"csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 07:13:46 crc kubenswrapper[4749]: I0320 07:13:46.177374 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:13:46 crc kubenswrapper[4749]: I0320 07:13:46.179010 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:13:46 crc kubenswrapper[4749]: I0320 07:13:46.179063 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:13:46 crc kubenswrapper[4749]: I0320 07:13:46.179079 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:13:46 crc kubenswrapper[4749]: I0320 07:13:46.179879 4749 scope.go:117] "RemoveContainer" containerID="a8d83b7d66918c0199a4fa1e65d2480ad645376f1dc2323a37f13c7f2e9db786" Mar 20 07:13:46 crc kubenswrapper[4749]: I0320 07:13:46.479260 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 07:13:46 crc kubenswrapper[4749]: I0320 07:13:46.487422 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8"} Mar 20 07:13:46 crc kubenswrapper[4749]: I0320 07:13:46.487737 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:13:46 crc kubenswrapper[4749]: I0320 07:13:46.488628 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:13:46 crc kubenswrapper[4749]: I0320 07:13:46.488690 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:13:46 crc kubenswrapper[4749]: I0320 07:13:46.488709 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:13:47 crc kubenswrapper[4749]: I0320 07:13:47.109669 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 07:13:47 crc kubenswrapper[4749]: I0320 07:13:47.177815 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:13:47 crc kubenswrapper[4749]: I0320 07:13:47.179508 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:13:47 crc kubenswrapper[4749]: I0320 07:13:47.179579 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:13:47 crc kubenswrapper[4749]: I0320 07:13:47.179603 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:13:47 crc kubenswrapper[4749]: I0320 07:13:47.179952 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 07:13:47 crc kubenswrapper[4749]: E0320 07:13:47.189594 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" 
Mar 20 07:13:47 crc kubenswrapper[4749]: E0320 07:13:47.189703 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 07:13:47 crc kubenswrapper[4749]: I0320 07:13:47.493432 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 07:13:47 crc kubenswrapper[4749]: I0320 07:13:47.494605 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 07:13:47 crc kubenswrapper[4749]: I0320 07:13:47.497719 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8" exitCode=255 Mar 20 07:13:47 crc kubenswrapper[4749]: I0320 07:13:47.497775 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8"} Mar 20 07:13:47 crc kubenswrapper[4749]: I0320 07:13:47.497832 4749 scope.go:117] "RemoveContainer" containerID="a8d83b7d66918c0199a4fa1e65d2480ad645376f1dc2323a37f13c7f2e9db786" Mar 20 07:13:47 crc kubenswrapper[4749]: I0320 07:13:47.498003 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:13:47 crc kubenswrapper[4749]: I0320 07:13:47.499394 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:13:47 crc kubenswrapper[4749]: I0320 07:13:47.499461 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:13:47 crc kubenswrapper[4749]: I0320 07:13:47.499484 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:13:47 crc kubenswrapper[4749]: I0320 07:13:47.500470 4749 scope.go:117] "RemoveContainer" containerID="f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8" Mar 20 07:13:47 crc kubenswrapper[4749]: E0320 07:13:47.500819 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 07:13:48 crc kubenswrapper[4749]: I0320 07:13:48.109535 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 07:13:48 crc kubenswrapper[4749]: I0320 07:13:48.501950 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 07:13:48 crc kubenswrapper[4749]: I0320 07:13:48.630391 4749 csr.go:261] certificate signing request csr-f27dk is 
approved, waiting to be issued Mar 20 07:13:48 crc kubenswrapper[4749]: I0320 07:13:48.638396 4749 csr.go:257] certificate signing request csr-f27dk is issued Mar 20 07:13:48 crc kubenswrapper[4749]: I0320 07:13:48.732824 4749 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 20 07:13:48 crc kubenswrapper[4749]: I0320 07:13:48.827062 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 07:13:48 crc kubenswrapper[4749]: I0320 07:13:48.827204 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:13:48 crc kubenswrapper[4749]: I0320 07:13:48.828346 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:13:48 crc kubenswrapper[4749]: I0320 07:13:48.828445 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:13:48 crc kubenswrapper[4749]: I0320 07:13:48.828465 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:13:48 crc kubenswrapper[4749]: I0320 07:13:48.829438 4749 scope.go:117] "RemoveContainer" containerID="f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8" Mar 20 07:13:48 crc kubenswrapper[4749]: E0320 07:13:48.829713 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 07:13:48 crc kubenswrapper[4749]: I0320 07:13:48.943097 4749 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 20 07:13:49 crc kubenswrapper[4749]: I0320 07:13:49.640048 4749 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-25 13:37:31.040473899 +0000 UTC Mar 20 07:13:49 crc kubenswrapper[4749]: I0320 07:13:49.640094 4749 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6006h23m41.400383429s for next certificate rotation Mar 20 07:13:51 crc kubenswrapper[4749]: I0320 07:13:51.768234 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 07:13:51 crc kubenswrapper[4749]: I0320 07:13:51.768504 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 07:13:51 crc kubenswrapper[4749]: I0320 07:13:51.770039 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:13:51 crc kubenswrapper[4749]: I0320 07:13:51.770095 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:13:51 crc kubenswrapper[4749]: I0320 07:13:51.770113 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:13:51 crc kubenswrapper[4749]: I0320 07:13:51.771112 4749 scope.go:117] "RemoveContainer" containerID="f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8" Mar 20 07:13:51 crc kubenswrapper[4749]: E0320 
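
The two certificate_manager lines above show client-go's jittered rotation policy: the next rotation is scheduled at a random point inside the certificate's validity window (roughly 70-90% of the lifetime in upstream client-go; treat the exact bounds as an assumption). The logged deadline of 2026-11-25 sits at about 73% of a lifetime running from the 2026-03-20 issuance seen in this log to the 2027-02-24 expiration, which is consistent with that policy. A sketch:

package main

import (
	"fmt"
	"math/rand"
	"time"
)

// Jittered rotation deadline in the style of client-go's certificate
// manager: rotate at a random point between 70% and 90% of the cert's
// validity (bounds assumed from upstream client-go behavior).
func rotationDeadline(notBefore, notAfter time.Time, rng *rand.Rand) time.Time {
	total := notAfter.Sub(notBefore)
	fraction := 0.7 + 0.2*rng.Float64()
	return notBefore.Add(time.Duration(fraction * float64(total)))
}

func main() {
	rng := rand.New(rand.NewSource(time.Now().UnixNano()))
	notBefore := time.Date(2026, 3, 20, 7, 13, 0, 0, time.UTC) // issuance time seen in this log (assumed NotBefore)
	notAfter := time.Date(2027, 2, 24, 5, 54, 36, 0, time.UTC) // expiration logged above
	deadline := rotationDeadline(notBefore, notAfter, rng)
	fmt.Println("rotation deadline:", deadline)
	fmt.Println("time until rotation:", deadline.Sub(notBefore))
}
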
Mar 20 07:13:51 crc kubenswrapper[4749]: E0320 07:13:51.771665 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 07:13:54 crc kubenswrapper[4749]: I0320 07:13:54.189739 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 07:13:54 crc kubenswrapper[4749]: I0320 07:13:54.191164 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 07:13:54 crc kubenswrapper[4749]: I0320 07:13:54.191276 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 07:13:54 crc kubenswrapper[4749]: I0320 07:13:54.191374 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 07:13:54 crc kubenswrapper[4749]: I0320 07:13:54.191515 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 07:13:54 crc kubenswrapper[4749]: I0320 07:13:54.202033 4749 kubelet_node_status.go:115] "Node was previously registered" node="crc"
Mar 20 07:13:54 crc kubenswrapper[4749]: I0320 07:13:54.202318 4749 kubelet_node_status.go:79] "Successfully registered node" node="crc"
Mar 20 07:13:54 crc kubenswrapper[4749]: E0320 07:13:54.202342 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Mar 20 07:13:54 crc kubenswrapper[4749]: I0320 07:13:54.208214 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 07:13:54 crc kubenswrapper[4749]: I0320 07:13:54.208261 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 07:13:54 crc kubenswrapper[4749]: I0320 07:13:54.208274 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 07:13:54 crc kubenswrapper[4749]: I0320 07:13:54.208310 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 07:13:54 crc kubenswrapper[4749]: I0320 07:13:54.208326 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:13:54Z","lastTransitionTime":"2026-03-20T07:13:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 07:13:54 crc kubenswrapper[4749]: E0320 07:13:54.229937 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:13:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:13:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:13:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:13:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cbc31b-af36-4be8-8e88-99f024097007\\\",\\\"systemUUID\\\":\\\"42f570dd-c9b2-4d24-870f-033a21aa11c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
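
The failed update above is a strategic merge patch against the node's status subresource; it is rejected not by RBAC this time but because the OpenShift admission webhook node.network-node-identity.openshift.io has nothing listening on 127.0.0.1:9743 yet (the network components are still down, hence the CNI NotReady condition). A minimal client-go sketch of the same verb and subresource; the kubeconfig path and the trivial one-condition patch body are assumptions, not the kubelet's actual payload:

package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/apimachinery/pkg/types"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	config, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig") // assumed path
	if err != nil {
		panic(err)
	}
	clientset := kubernetes.NewForConfigOrDie(config)

	// Strategic merge patch of the "status" subresource of node "crc",
	// the same operation the kubelet retries above. This minimal body
	// touches a single condition and is purely illustrative.
	patch := []byte(`{"status":{"conditions":[{"type":"Ready","status":"False","reason":"KubeletNotReady"}]}}`)
	node, err := clientset.CoreV1().Nodes().Patch(
		context.TODO(), "crc", types.StrategicMergePatchType, patch,
		metav1.PatchOptions{}, "status")
	if err != nil {
		panic(err) // an admission webhook failure would surface here
	}
	fmt.Println("patched node:", node.Name)
}
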
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 07:13:54 crc kubenswrapper[4749]: I0320 07:13:54.239396 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 07:13:54 crc kubenswrapper[4749]: I0320 07:13:54.239467 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:13:54Z","lastTransitionTime":"2026-03-20T07:13:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 07:13:54 crc kubenswrapper[4749]: I0320 07:13:54.261888 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 07:13:54 crc kubenswrapper[4749]: I0320 07:13:54.261966 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 07:13:54 crc kubenswrapper[4749]: I0320 07:13:54.261990 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 07:13:54 crc kubenswrapper[4749]: I0320 07:13:54.262021 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 07:13:54 crc kubenswrapper[4749]: I0320 07:13:54.262048 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:13:54Z","lastTransitionTime":"2026-03-20T07:13:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 07:13:54 crc kubenswrapper[4749]: E0320 07:13:54.270504 4749 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 20 07:13:54 crc kubenswrapper[4749]: I0320 07:13:54.285861 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 07:13:54 crc kubenswrapper[4749]: I0320 07:13:54.285973 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 07:13:54 crc kubenswrapper[4749]: I0320 07:13:54.286038 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 07:13:54 crc kubenswrapper[4749]: I0320 07:13:54.286101 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 07:13:54 crc kubenswrapper[4749]: I0320 07:13:54.286156 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:13:54Z","lastTransitionTime":"2026-03-20T07:13:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 07:13:54 crc kubenswrapper[4749]: E0320 07:13:54.301564 4749 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 20 07:13:54 crc kubenswrapper[4749]: E0320 07:13:54.301678 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:54 crc kubenswrapper[4749]: E0320 07:13:54.504032 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:54 crc kubenswrapper[4749]: E0320 07:13:54.605056 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:54 crc kubenswrapper[4749]: E0320 07:13:54.705977 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:54 crc kubenswrapper[4749]: E0320 07:13:54.807099 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:54 crc kubenswrapper[4749]: E0320 07:13:54.907922 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:55 crc kubenswrapper[4749]: E0320 07:13:55.008951 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:55 crc kubenswrapper[4749]: E0320 07:13:55.109807 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:55 crc kubenswrapper[4749]: E0320 07:13:55.210817 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:55 crc kubenswrapper[4749]: E0320 07:13:55.311990 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:55 crc kubenswrapper[4749]: E0320 07:13:55.413205 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:55 crc kubenswrapper[4749]: E0320 07:13:55.514580 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:55 crc kubenswrapper[4749]: E0320 07:13:55.615721 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:55 crc kubenswrapper[4749]: E0320 07:13:55.715925 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:55 crc kubenswrapper[4749]: E0320 07:13:55.816368 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:55 crc kubenswrapper[4749]: E0320 07:13:55.916560 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:56 crc kubenswrapper[4749]: E0320 07:13:56.016965 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:56 crc kubenswrapper[4749]: E0320 07:13:56.118091 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:56 crc kubenswrapper[4749]: E0320 07:13:56.218505 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:56 crc kubenswrapper[4749]: E0320 07:13:56.319329 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:56 crc kubenswrapper[4749]: E0320 07:13:56.419775 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:56 crc kubenswrapper[4749]: E0320 
07:13:56.520339 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:56 crc kubenswrapper[4749]: E0320 07:13:56.621072 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:56 crc kubenswrapper[4749]: E0320 07:13:56.721370 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:56 crc kubenswrapper[4749]: E0320 07:13:56.821961 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:56 crc kubenswrapper[4749]: E0320 07:13:56.922368 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:57 crc kubenswrapper[4749]: E0320 07:13:57.022715 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:57 crc kubenswrapper[4749]: E0320 07:13:57.123474 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:57 crc kubenswrapper[4749]: E0320 07:13:57.223839 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:57 crc kubenswrapper[4749]: E0320 07:13:57.323957 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:57 crc kubenswrapper[4749]: E0320 07:13:57.424532 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:57 crc kubenswrapper[4749]: E0320 07:13:57.524644 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:57 crc kubenswrapper[4749]: E0320 07:13:57.625029 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:57 crc kubenswrapper[4749]: E0320 07:13:57.725536 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:57 crc kubenswrapper[4749]: E0320 07:13:57.826653 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:57 crc kubenswrapper[4749]: E0320 07:13:57.927828 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:58 crc kubenswrapper[4749]: E0320 07:13:58.028204 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:58 crc kubenswrapper[4749]: E0320 07:13:58.128681 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:58 crc kubenswrapper[4749]: E0320 07:13:58.229182 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:58 crc kubenswrapper[4749]: E0320 07:13:58.329459 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:58 crc kubenswrapper[4749]: E0320 07:13:58.430575 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:58 crc kubenswrapper[4749]: E0320 07:13:58.531704 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:58 crc 
kubenswrapper[4749]: E0320 07:13:58.632605 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:58 crc kubenswrapper[4749]: E0320 07:13:58.733140 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:58 crc kubenswrapper[4749]: I0320 07:13:58.811917 4749 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 20 07:13:58 crc kubenswrapper[4749]: E0320 07:13:58.833361 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:58 crc kubenswrapper[4749]: E0320 07:13:58.934318 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:59 crc kubenswrapper[4749]: E0320 07:13:59.034676 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:59 crc kubenswrapper[4749]: E0320 07:13:59.135541 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:59 crc kubenswrapper[4749]: E0320 07:13:59.236362 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:59 crc kubenswrapper[4749]: E0320 07:13:59.337248 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:59 crc kubenswrapper[4749]: E0320 07:13:59.438266 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:59 crc kubenswrapper[4749]: E0320 07:13:59.538790 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:59 crc kubenswrapper[4749]: E0320 07:13:59.639360 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:59 crc kubenswrapper[4749]: E0320 07:13:59.740906 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:59 crc kubenswrapper[4749]: E0320 07:13:59.841853 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:13:59 crc kubenswrapper[4749]: E0320 07:13:59.942187 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:14:00 crc kubenswrapper[4749]: E0320 07:14:00.042320 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:14:00 crc kubenswrapper[4749]: E0320 07:14:00.142885 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:14:00 crc kubenswrapper[4749]: E0320 07:14:00.243930 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:14:00 crc kubenswrapper[4749]: E0320 07:14:00.344828 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:14:00 crc kubenswrapper[4749]: E0320 07:14:00.446033 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:14:00 crc kubenswrapper[4749]: E0320 07:14:00.546200 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 
07:14:00 crc kubenswrapper[4749]: E0320 07:14:00.647023 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:14:00 crc kubenswrapper[4749]: E0320 07:14:00.748121 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:14:00 crc kubenswrapper[4749]: E0320 07:14:00.848859 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:14:00 crc kubenswrapper[4749]: E0320 07:14:00.948972 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:14:01 crc kubenswrapper[4749]: E0320 07:14:01.049722 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:14:01 crc kubenswrapper[4749]: E0320 07:14:01.150746 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:14:01 crc kubenswrapper[4749]: E0320 07:14:01.251880 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:14:01 crc kubenswrapper[4749]: E0320 07:14:01.352698 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:14:01 crc kubenswrapper[4749]: E0320 07:14:01.453252 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:14:01 crc kubenswrapper[4749]: E0320 07:14:01.553500 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:14:01 crc kubenswrapper[4749]: E0320 07:14:01.654013 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:14:01 crc kubenswrapper[4749]: E0320 07:14:01.754806 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:14:01 crc kubenswrapper[4749]: E0320 07:14:01.855402 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:14:01 crc kubenswrapper[4749]: E0320 07:14:01.956397 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:14:02 crc kubenswrapper[4749]: E0320 07:14:02.056958 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:14:02 crc kubenswrapper[4749]: E0320 07:14:02.157190 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:14:02 crc kubenswrapper[4749]: E0320 07:14:02.257552 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:14:02 crc kubenswrapper[4749]: E0320 07:14:02.358214 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:14:02 crc kubenswrapper[4749]: E0320 07:14:02.459155 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:14:02 crc kubenswrapper[4749]: E0320 07:14:02.559778 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:14:02 crc kubenswrapper[4749]: E0320 07:14:02.660393 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" 
not found" Mar 20 07:14:02 crc kubenswrapper[4749]: E0320 07:14:02.761447 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:14:02 crc kubenswrapper[4749]: E0320 07:14:02.861593 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:14:02 crc kubenswrapper[4749]: E0320 07:14:02.962405 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:14:03 crc kubenswrapper[4749]: E0320 07:14:03.063251 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:14:03 crc kubenswrapper[4749]: E0320 07:14:03.163611 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:14:03 crc kubenswrapper[4749]: E0320 07:14:03.264314 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:14:03 crc kubenswrapper[4749]: E0320 07:14:03.365275 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:14:03 crc kubenswrapper[4749]: E0320 07:14:03.466006 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:14:03 crc kubenswrapper[4749]: E0320 07:14:03.566391 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:14:03 crc kubenswrapper[4749]: E0320 07:14:03.667530 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:14:03 crc kubenswrapper[4749]: E0320 07:14:03.768366 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:14:03 crc kubenswrapper[4749]: E0320 07:14:03.869652 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 07:14:03 crc kubenswrapper[4749]: I0320 07:14:03.890269 4749 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 20 07:14:03 crc kubenswrapper[4749]: I0320 07:14:03.972759 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:03 crc kubenswrapper[4749]: I0320 07:14:03.972805 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:03 crc kubenswrapper[4749]: I0320 07:14:03.972822 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:03 crc kubenswrapper[4749]: I0320 07:14:03.972845 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:03 crc kubenswrapper[4749]: I0320 07:14:03.972864 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:03Z","lastTransitionTime":"2026-03-20T07:14:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.076253 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.076371 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.076399 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.076431 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.076459 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:04Z","lastTransitionTime":"2026-03-20T07:14:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.125699 4749 apiserver.go:52] "Watching apiserver" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.131579 4749 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.132228 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-operator/iptables-alerter-4ln5h","openshift-multus/multus-rcq9v","openshift-network-node-identity/network-node-identity-vrzqb","openshift-ovn-kubernetes/ovnkube-node-tdgcw","openshift-multus/network-metrics-daemon-k56zh","openshift-image-registry/node-ca-r9vtf","openshift-machine-config-operator/machine-config-daemon-fxqfd","openshift-multus/multus-additional-cni-plugins-g4qlg","openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68xpr","openshift-dns/node-resolver-fnwpn","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"] Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.132824 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.132863 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.132989 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:14:04 crc kubenswrapper[4749]: E0320 07:14:04.133012 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 07:14:04 crc kubenswrapper[4749]: E0320 07:14:04.133340 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.133400 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.133566 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.133825 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:14:04 crc kubenswrapper[4749]: E0320 07:14:04.133891 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.134121 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-r9vtf" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.134327 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.134448 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68xpr" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.134558 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-fnwpn" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.135102 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-g4qlg" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.135574 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.136556 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:14:04 crc kubenswrapper[4749]: E0320 07:14:04.136640 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.136980 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.140980 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.140994 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.141110 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.141127 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.141639 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.141723 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.141928 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.141974 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.142045 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.142074 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.142113 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.142274 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.142370 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.142393 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.142441 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.142471 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.142515 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.142583 4749 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.142600 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.142934 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.142970 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.143715 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.144975 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.145408 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.146469 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.147275 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.147599 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.152322 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.152488 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.152626 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.152782 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.153765 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.154518 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.154727 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.154880 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.155142 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.155309 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 20 
07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.175784 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.179154 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.179212 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.179236 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.179271 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.179327 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:04Z","lastTransitionTime":"2026-03-20T07:14:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.195723 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.207139 4749 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.212086 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.231944 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g4qlg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19bf4391-88b7-43a0-9b6a-435261a44ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g4qlg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.237927 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.237981 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.238014 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.238044 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.238078 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.238105 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.238136 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.238166 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.238194 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.238224 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.238260 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.238311 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.238341 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.238370 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.238399 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.238426 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.238455 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.238491 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.238527 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.238570 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.238604 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.238636 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.238665 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.238695 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.238725 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.238756 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.238787 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.238815 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.238825 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.238845 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.238977 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.239021 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.239055 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.239088 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.239124 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.239109 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.239158 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.239193 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.239212 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.239226 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.239257 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.239273 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.239348 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.239389 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.239420 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.238898 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.239454 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.239497 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.239521 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.239554 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.239584 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.239617 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.239650 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.239683 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.239669 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.239716 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.239749 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.239829 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.239881 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.239925 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.239962 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.239997 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.240030 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.240065 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.240096 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.240134 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.240166 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.240201 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.240232 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.240263 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.240347 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.240393 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.240427 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.240458 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.240490 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.240524 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.240557 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.240592 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.240624 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.240657 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.240698 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.240734 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.240786 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.240817 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.240853 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.240885 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.240916 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.240947 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.240979 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.241012 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.241045 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.241076 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.241106 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.241138 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.241171 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.241203 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.241244 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.241305 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.241339 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.241389 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.241438 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.241485 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.241534 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.241583 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 07:14:04 crc 
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.241630 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.241684 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.241736 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.241788 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.241837 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.241889 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.241938 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.242093 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.242152 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.242207 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.242258 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.242371 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.242424 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.242475 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.242520 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.242573 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.242626 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.242679 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.242731 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.242780 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.242827 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.242875 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.242925 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.242979 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.243028 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.243088 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.243137 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.243188 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.243239 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.243325 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.243380 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.243433 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.243485 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.243535 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.243583 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.243664 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.243715 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.243797 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.243851 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.243904 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.243955 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.244025 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.244075 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.244123 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.244173 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.244219 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.244309 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.244368 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.244419 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.244465 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.244513 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.244561 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.244613 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.244666 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.244719 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.244772 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.244822 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.244874 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.244921 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.244967 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.245013 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.245060 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.245109 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.245155 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.245205 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.245257 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.245347 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.245399 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.245452 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.245504 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.245559 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.245605 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.245650 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.245703 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.245750 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.245804 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.245855 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.245910 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.245961 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.246008 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.246064 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.246113 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.246167 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.246219 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.246269 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.246377 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.246433 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.246484 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.246531 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.246587 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.246637 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.246697 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.246749 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.246801 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.246848 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.246895 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.239778 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.239807 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.239868 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.239968 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.240077 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.240217 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.240369 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.240386 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.240407 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.240433 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.240653 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.240707 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.240843 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.241018 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.241034 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.241318 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.241490 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.241434 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.241538 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.241735 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.242000 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.242096 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.242108 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.242701 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.242719 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.242938 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.242529 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.243077 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.243400 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.243464 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.243628 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.243794 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.243814 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.243919 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.244054 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.244085 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.244586 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.244658 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.245050 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.245378 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.245770 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.245852 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.245851 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.246526 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.246610 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.246948 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.247602 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.247666 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.247905 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.247921 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.248144 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.248319 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.248963 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.249174 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3f813da7-84d4-4550-ad66-f282814444a3-host-var-lib-cni-multus\") pod \"multus-rcq9v\" (UID: \"3f813da7-84d4-4550-ad66-f282814444a3\") " pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.249224 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-run-openvswitch\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.249266 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/19bf4391-88b7-43a0-9b6a-435261a44ed5-cnibin\") pod \"multus-additional-cni-plugins-g4qlg\" (UID: \"19bf4391-88b7-43a0-9b6a-435261a44ed5\") " pod="openshift-multus/multus-additional-cni-plugins-g4qlg" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.249361 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3f813da7-84d4-4550-ad66-f282814444a3-host-run-multus-certs\") pod \"multus-rcq9v\" (UID: \"3f813da7-84d4-4550-ad66-f282814444a3\") " pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.249426 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x656g\" (UniqueName: \"kubernetes.io/projected/cf5fc763-08fb-4b02-a3cd-6f85310f0e14-kube-api-access-x656g\") pod \"node-ca-r9vtf\" (UID: \"cf5fc763-08fb-4b02-a3cd-6f85310f0e14\") " pod="openshift-image-registry/node-ca-r9vtf" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.249443 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.249477 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.249483 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.249448 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.249513 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-host-cni-netd\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.249549 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.249606 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/19bf4391-88b7-43a0-9b6a-435261a44ed5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g4qlg\" (UID: \"19bf4391-88b7-43a0-9b6a-435261a44ed5\") " pod="openshift-multus/multus-additional-cni-plugins-g4qlg" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.249651 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.249762 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3f813da7-84d4-4550-ad66-f282814444a3-multus-daemon-config\") pod \"multus-rcq9v\" (UID: \"3f813da7-84d4-4550-ad66-f282814444a3\") " pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.249820 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.249938 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.250074 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.250148 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5-env-overrides\") pod \"ovnkube-control-plane-749d76644c-68xpr\" (UID: \"c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68xpr" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.250188 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3f813da7-84d4-4550-ad66-f282814444a3-host-run-k8s-cni-cncf-io\") pod \"multus-rcq9v\" (UID: \"3f813da7-84d4-4550-ad66-f282814444a3\") " pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.250224 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3f813da7-84d4-4550-ad66-f282814444a3-host-var-lib-kubelet\") pod \"multus-rcq9v\" (UID: \"3f813da7-84d4-4550-ad66-f282814444a3\") " pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.250263 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cf5fc763-08fb-4b02-a3cd-6f85310f0e14-host\") pod \"node-ca-r9vtf\" (UID: \"cf5fc763-08fb-4b02-a3cd-6f85310f0e14\") " pod="openshift-image-registry/node-ca-r9vtf" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.250330 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2153d97b-a108-49f8-b6c8-8223ea65b878-env-overrides\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.250414 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.250577 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.250788 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.250840 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w749n\" (UniqueName: \"kubernetes.io/projected/12151228-1cb9-4086-9a62-f4a9583f5f69-kube-api-access-w749n\") pod \"machine-config-daemon-fxqfd\" (UID: \"12151228-1cb9-4086-9a62-f4a9583f5f69\") " pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.250886 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3f813da7-84d4-4550-ad66-f282814444a3-cnibin\") pod \"multus-rcq9v\" (UID: \"3f813da7-84d4-4550-ad66-f282814444a3\") " pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.250922 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/cf5fc763-08fb-4b02-a3cd-6f85310f0e14-serviceca\") pod \"node-ca-r9vtf\" (UID: \"cf5fc763-08fb-4b02-a3cd-6f85310f0e14\") " pod="openshift-image-registry/node-ca-r9vtf" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.250955 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.250989 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-node-log\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.251022 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2153d97b-a108-49f8-b6c8-8223ea65b878-ovnkube-script-lib\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.251113 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.251148 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-systemd-units\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.251183 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2153d97b-a108-49f8-b6c8-8223ea65b878-ovnkube-config\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.251223 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d19b89e-d048-4656-b5ce-c637190ab678-metrics-certs\") pod \"network-metrics-daemon-k56zh\" (UID: \"6d19b89e-d048-4656-b5ce-c637190ab678\") " pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.251249 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.251256 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3f813da7-84d4-4550-ad66-f282814444a3-hostroot\") pod \"multus-rcq9v\" (UID: \"3f813da7-84d4-4550-ad66-f282814444a3\") " pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.251425 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3f813da7-84d4-4550-ad66-f282814444a3-multus-conf-dir\") pod \"multus-rcq9v\" (UID: \"3f813da7-84d4-4550-ad66-f282814444a3\") " pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.251471 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/19bf4391-88b7-43a0-9b6a-435261a44ed5-cni-binary-copy\") pod \"multus-additional-cni-plugins-g4qlg\" (UID: \"19bf4391-88b7-43a0-9b6a-435261a44ed5\") " pod="openshift-multus/multus-additional-cni-plugins-g4qlg" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.251518 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppqc8\" (UniqueName: \"kubernetes.io/projected/19bf4391-88b7-43a0-9b6a-435261a44ed5-kube-api-access-ppqc8\") pod \"multus-additional-cni-plugins-g4qlg\" (UID: \"19bf4391-88b7-43a0-9b6a-435261a44ed5\") " pod="openshift-multus/multus-additional-cni-plugins-g4qlg" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.251555 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-68xpr\" (UID: \"c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68xpr" Mar 20 07:14:04 crc kubenswrapper[4749]: 
I0320 07:14:04.251577 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.251598 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-68xpr\" (UID: \"c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68xpr" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.251638 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3f813da7-84d4-4550-ad66-f282814444a3-multus-cni-dir\") pod \"multus-rcq9v\" (UID: \"3f813da7-84d4-4550-ad66-f282814444a3\") " pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.251671 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3f813da7-84d4-4550-ad66-f282814444a3-cni-binary-copy\") pod \"multus-rcq9v\" (UID: \"3f813da7-84d4-4550-ad66-f282814444a3\") " pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.251702 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-host-kubelet\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.251738 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-host-run-netns\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.251799 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgjwb\" (UniqueName: \"kubernetes.io/projected/fdf0a692-3cf9-4abe-8b52-c81a040c0e54-kube-api-access-mgjwb\") pod \"node-resolver-fnwpn\" (UID: \"fdf0a692-3cf9-4abe-8b52-c81a040c0e54\") " pod="openshift-dns/node-resolver-fnwpn" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.251837 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.251868 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-host-slash\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.251893 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-run-systemd\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.251915 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-host-cni-bin\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.251939 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/19bf4391-88b7-43a0-9b6a-435261a44ed5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g4qlg\" (UID: \"19bf4391-88b7-43a0-9b6a-435261a44ed5\") " pod="openshift-multus/multus-additional-cni-plugins-g4qlg" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.251970 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.251981 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.252092 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.252147 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.252813 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). 
InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: E0320 07:14:04.252944 4749 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 07:14:04 crc kubenswrapper[4749]: E0320 07:14:04.253095 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 07:14:04.753008915 +0000 UTC m=+81.302666602 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.253148 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.253849 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnwpn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdf0a692-3cf9-4abe-8b52-c81a040c0e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgjwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnwpn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.253941 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.254128 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.254175 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.254228 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.254520 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.256020 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.256158 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.256069 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.256441 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.254959 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.256612 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.256732 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.256897 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.258562 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.258645 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.258995 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.259008 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: E0320 07:14:04.259726 4749 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 07:14:04 crc kubenswrapper[4749]: E0320 07:14:04.259899 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 07:14:04.759841891 +0000 UTC m=+81.309499568 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.251995 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/12151228-1cb9-4086-9a62-f4a9583f5f69-rootfs\") pod \"machine-config-daemon-fxqfd\" (UID: \"12151228-1cb9-4086-9a62-f4a9583f5f69\") " pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.260464 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45656\" (UniqueName: \"kubernetes.io/projected/c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5-kube-api-access-45656\") pod \"ovnkube-control-plane-749d76644c-68xpr\" (UID: \"c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68xpr" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.260570 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcf5r\" (UniqueName: \"kubernetes.io/projected/6d19b89e-d048-4656-b5ce-c637190ab678-kube-api-access-mcf5r\") pod \"network-metrics-daemon-k56zh\" (UID: \"6d19b89e-d048-4656-b5ce-c637190ab678\") " pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.260601 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3f813da7-84d4-4550-ad66-f282814444a3-system-cni-dir\") pod \"multus-rcq9v\" (UID: \"3f813da7-84d4-4550-ad66-f282814444a3\") " pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.260689 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3f813da7-84d4-4550-ad66-f282814444a3-os-release\") pod \"multus-rcq9v\" (UID: \"3f813da7-84d4-4550-ad66-f282814444a3\") " pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.260745 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3f813da7-84d4-4550-ad66-f282814444a3-multus-socket-dir-parent\") pod \"multus-rcq9v\" (UID: \"3f813da7-84d4-4550-ad66-f282814444a3\") " pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.260773 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2153d97b-a108-49f8-b6c8-8223ea65b878-ovn-node-metrics-cert\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.260794 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77vkc\" (UniqueName: \"kubernetes.io/projected/2153d97b-a108-49f8-b6c8-8223ea65b878-kube-api-access-77vkc\") pod \"ovnkube-node-tdgcw\" 
(UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.260819 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.260845 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.260864 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.260893 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.260921 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.261031 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.261151 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-host-run-ovn-kubernetes\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.261221 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/12151228-1cb9-4086-9a62-f4a9583f5f69-proxy-tls\") pod \"machine-config-daemon-fxqfd\" (UID: \"12151228-1cb9-4086-9a62-f4a9583f5f69\") " pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.261249 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3f813da7-84d4-4550-ad66-f282814444a3-etc-kubernetes\") pod \"multus-rcq9v\" (UID: \"3f813da7-84d4-4550-ad66-f282814444a3\") " pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.261306 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-etc-openvswitch\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.261490 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.261598 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-log-socket\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.261626 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/19bf4391-88b7-43a0-9b6a-435261a44ed5-os-release\") pod \"multus-additional-cni-plugins-g4qlg\" (UID: \"19bf4391-88b7-43a0-9b6a-435261a44ed5\") " pod="openshift-multus/multus-additional-cni-plugins-g4qlg" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.261635 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.261651 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/12151228-1cb9-4086-9a62-f4a9583f5f69-mcd-auth-proxy-config\") pod \"machine-config-daemon-fxqfd\" (UID: \"12151228-1cb9-4086-9a62-f4a9583f5f69\") " pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.261740 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3f813da7-84d4-4550-ad66-f282814444a3-host-run-netns\") pod \"multus-rcq9v\" (UID: \"3f813da7-84d4-4550-ad66-f282814444a3\") " pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.261978 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.262018 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.262031 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-run-ovn\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.262117 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fdf0a692-3cf9-4abe-8b52-c81a040c0e54-hosts-file\") pod \"node-resolver-fnwpn\" (UID: \"fdf0a692-3cf9-4abe-8b52-c81a040c0e54\") " pod="openshift-dns/node-resolver-fnwpn" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.262167 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3f813da7-84d4-4550-ad66-f282814444a3-host-var-lib-cni-bin\") pod \"multus-rcq9v\" (UID: \"3f813da7-84d4-4550-ad66-f282814444a3\") " pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.262165 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.262212 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkbh9\" (UniqueName: \"kubernetes.io/projected/3f813da7-84d4-4550-ad66-f282814444a3-kube-api-access-xkbh9\") pod \"multus-rcq9v\" (UID: \"3f813da7-84d4-4550-ad66-f282814444a3\") " pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.262250 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-var-lib-openvswitch\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.262461 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/19bf4391-88b7-43a0-9b6a-435261a44ed5-system-cni-dir\") pod \"multus-additional-cni-plugins-g4qlg\" (UID: \"19bf4391-88b7-43a0-9b6a-435261a44ed5\") " pod="openshift-multus/multus-additional-cni-plugins-g4qlg" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.262056 4749 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.262695 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.262987 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.263305 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.263361 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: E0320 07:14:04.263680 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 07:14:04.763635142 +0000 UTC m=+81.313292819 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.264083 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.264120 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.264321 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.264404 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.264423 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.264599 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.264632 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.264728 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.264739 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.264770 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.264808 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.264912 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.265235 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.266555 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). 
InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.266640 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.266792 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.266805 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.266826 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.266841 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.266965 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.267456 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.267575 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.267709 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.263533 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.268680 4749 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.268701 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.268754 4749 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.268770 4749 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.268807 4749 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.268966 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.268983 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.268996 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269009 4749 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269022 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269035 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269048 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269061 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269076 4749 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269088 4749 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269100 4749 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269112 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269124 4749 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269136 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269148 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269160 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269172 4749 
reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269184 4749 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269197 4749 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269209 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269221 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269237 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269249 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269261 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269272 4749 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269301 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269314 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269325 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269336 4749 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269348 4749 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269360 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269372 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269384 4749 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269396 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269407 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269419 4749 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269431 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269444 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269456 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269468 4749 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269480 4749 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269491 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269503 4749 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269516 4749 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269527 4749 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269539 4749 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269551 4749 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269563 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269574 4749 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269585 4749 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269597 4749 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269608 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269622 4749 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269633 4749 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269645 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269658 4749 reconciler_common.go:293] "Volume detached for volume 
\"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269673 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269684 4749 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269696 4749 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269708 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.269720 4749 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.265959 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.270419 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.270745 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.270770 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.283427 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). 
InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.286636 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.286923 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.287057 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.287086 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.287195 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.287352 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.287582 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k56zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d19b89e-d048-4656-b5ce-c637190ab678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k56zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 07:14:04 crc kubenswrapper[4749]: E0320 07:14:04.288188 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 07:14:04 crc kubenswrapper[4749]: E0320 07:14:04.288210 4749 projected.go:288] Couldn't get 
configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 07:14:04 crc kubenswrapper[4749]: E0320 07:14:04.288223 4749 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.288215 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.287937 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: E0320 07:14:04.288306 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 07:14:04.788269309 +0000 UTC m=+81.337927106 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 07:14:04 crc kubenswrapper[4749]: E0320 07:14:04.288447 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 07:14:04 crc kubenswrapper[4749]: E0320 07:14:04.288476 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 07:14:04 crc kubenswrapper[4749]: E0320 07:14:04.288497 4749 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.288843 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: E0320 07:14:04.289270 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 07:14:04.788548595 +0000 UTC m=+81.338206282 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.289967 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.289995 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.290032 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.290315 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.290630 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.290708 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.291670 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.292482 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.292660 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.292775 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.293332 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.293315 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.293879 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.294246 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.294252 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.294457 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.294665 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.294777 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.294695 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.295029 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.296117 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.294886 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.296810 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.296849 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:04Z","lastTransitionTime":"2026-03-20T07:14:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.302018 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.303125 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12151228-1cb9-4086-9a62-f4a9583f5f69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-fxqfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.305496 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.306928 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.306891 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.319582 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.331909 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.333606 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.334368 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.336250 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.354538 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.355688 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.357603 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.357866 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.358383 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.358414 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.361505 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.362610 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.363858 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.364053 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.364080 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.364199 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.364573 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.364339 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.367444 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.367524 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.367784 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.367802 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.367800 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.367945 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.368311 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.368523 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.368580 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.368881 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.368908 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.369034 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.369057 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.369091 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.369211 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.369349 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.369375 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.369442 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.369470 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.369738 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.369897 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.370047 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.370131 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgjwb\" (UniqueName: \"kubernetes.io/projected/fdf0a692-3cf9-4abe-8b52-c81a040c0e54-kube-api-access-mgjwb\") pod \"node-resolver-fnwpn\" (UID: \"fdf0a692-3cf9-4abe-8b52-c81a040c0e54\") " pod="openshift-dns/node-resolver-fnwpn" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.370185 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-host-slash\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.370206 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-run-systemd\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.370227 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-host-cni-bin\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.370247 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/19bf4391-88b7-43a0-9b6a-435261a44ed5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g4qlg\" 
(UID: \"19bf4391-88b7-43a0-9b6a-435261a44ed5\") " pod="openshift-multus/multus-additional-cni-plugins-g4qlg" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.370268 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3f813da7-84d4-4550-ad66-f282814444a3-multus-socket-dir-parent\") pod \"multus-rcq9v\" (UID: \"3f813da7-84d4-4550-ad66-f282814444a3\") " pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.370317 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/12151228-1cb9-4086-9a62-f4a9583f5f69-rootfs\") pod \"machine-config-daemon-fxqfd\" (UID: \"12151228-1cb9-4086-9a62-f4a9583f5f69\") " pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.370375 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45656\" (UniqueName: \"kubernetes.io/projected/c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5-kube-api-access-45656\") pod \"ovnkube-control-plane-749d76644c-68xpr\" (UID: \"c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68xpr" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.370399 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcf5r\" (UniqueName: \"kubernetes.io/projected/6d19b89e-d048-4656-b5ce-c637190ab678-kube-api-access-mcf5r\") pod \"network-metrics-daemon-k56zh\" (UID: \"6d19b89e-d048-4656-b5ce-c637190ab678\") " pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.370420 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3f813da7-84d4-4550-ad66-f282814444a3-system-cni-dir\") pod \"multus-rcq9v\" (UID: \"3f813da7-84d4-4550-ad66-f282814444a3\") " pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.370439 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3f813da7-84d4-4550-ad66-f282814444a3-os-release\") pod \"multus-rcq9v\" (UID: \"3f813da7-84d4-4550-ad66-f282814444a3\") " pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.370460 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-host-run-ovn-kubernetes\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.370480 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2153d97b-a108-49f8-b6c8-8223ea65b878-ovn-node-metrics-cert\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.370509 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77vkc\" (UniqueName: \"kubernetes.io/projected/2153d97b-a108-49f8-b6c8-8223ea65b878-kube-api-access-77vkc\") pod \"ovnkube-node-tdgcw\" 
(UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.370521 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-host-slash\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.370530 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.370566 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/12151228-1cb9-4086-9a62-f4a9583f5f69-proxy-tls\") pod \"machine-config-daemon-fxqfd\" (UID: \"12151228-1cb9-4086-9a62-f4a9583f5f69\") " pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.370588 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3f813da7-84d4-4550-ad66-f282814444a3-etc-kubernetes\") pod \"multus-rcq9v\" (UID: \"3f813da7-84d4-4550-ad66-f282814444a3\") " pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.370607 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-etc-openvswitch\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.370627 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-log-socket\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.370647 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/19bf4391-88b7-43a0-9b6a-435261a44ed5-os-release\") pod \"multus-additional-cni-plugins-g4qlg\" (UID: \"19bf4391-88b7-43a0-9b6a-435261a44ed5\") " pod="openshift-multus/multus-additional-cni-plugins-g4qlg" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.370676 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/12151228-1cb9-4086-9a62-f4a9583f5f69-mcd-auth-proxy-config\") pod \"machine-config-daemon-fxqfd\" (UID: \"12151228-1cb9-4086-9a62-f4a9583f5f69\") " pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.370697 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3f813da7-84d4-4550-ad66-f282814444a3-host-run-netns\") pod \"multus-rcq9v\" (UID: 
\"3f813da7-84d4-4550-ad66-f282814444a3\") " pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.370717 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-run-ovn\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.370738 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fdf0a692-3cf9-4abe-8b52-c81a040c0e54-hosts-file\") pod \"node-resolver-fnwpn\" (UID: \"fdf0a692-3cf9-4abe-8b52-c81a040c0e54\") " pod="openshift-dns/node-resolver-fnwpn" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.370758 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3f813da7-84d4-4550-ad66-f282814444a3-host-var-lib-cni-bin\") pod \"multus-rcq9v\" (UID: \"3f813da7-84d4-4550-ad66-f282814444a3\") " pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.370781 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkbh9\" (UniqueName: \"kubernetes.io/projected/3f813da7-84d4-4550-ad66-f282814444a3-kube-api-access-xkbh9\") pod \"multus-rcq9v\" (UID: \"3f813da7-84d4-4550-ad66-f282814444a3\") " pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.370802 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-var-lib-openvswitch\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.370822 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/19bf4391-88b7-43a0-9b6a-435261a44ed5-system-cni-dir\") pod \"multus-additional-cni-plugins-g4qlg\" (UID: \"19bf4391-88b7-43a0-9b6a-435261a44ed5\") " pod="openshift-multus/multus-additional-cni-plugins-g4qlg" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.370844 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3f813da7-84d4-4550-ad66-f282814444a3-host-var-lib-cni-multus\") pod \"multus-rcq9v\" (UID: \"3f813da7-84d4-4550-ad66-f282814444a3\") " pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.370864 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-run-openvswitch\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.370884 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/19bf4391-88b7-43a0-9b6a-435261a44ed5-cnibin\") pod \"multus-additional-cni-plugins-g4qlg\" (UID: \"19bf4391-88b7-43a0-9b6a-435261a44ed5\") " pod="openshift-multus/multus-additional-cni-plugins-g4qlg" Mar 20 07:14:04 
crc kubenswrapper[4749]: I0320 07:14:04.370904 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3f813da7-84d4-4550-ad66-f282814444a3-host-run-multus-certs\") pod \"multus-rcq9v\" (UID: \"3f813da7-84d4-4550-ad66-f282814444a3\") " pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.370925 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x656g\" (UniqueName: \"kubernetes.io/projected/cf5fc763-08fb-4b02-a3cd-6f85310f0e14-kube-api-access-x656g\") pod \"node-ca-r9vtf\" (UID: \"cf5fc763-08fb-4b02-a3cd-6f85310f0e14\") " pod="openshift-image-registry/node-ca-r9vtf" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.370949 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-host-cni-netd\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.371007 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.371009 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/19bf4391-88b7-43a0-9b6a-435261a44ed5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g4qlg\" (UID: \"19bf4391-88b7-43a0-9b6a-435261a44ed5\") " pod="openshift-multus/multus-additional-cni-plugins-g4qlg" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.371031 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/19bf4391-88b7-43a0-9b6a-435261a44ed5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g4qlg\" (UID: \"19bf4391-88b7-43a0-9b6a-435261a44ed5\") " pod="openshift-multus/multus-additional-cni-plugins-g4qlg" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.371042 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3f813da7-84d4-4550-ad66-f282814444a3-multus-socket-dir-parent\") pod \"multus-rcq9v\" (UID: \"3f813da7-84d4-4550-ad66-f282814444a3\") " pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.371057 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-run-systemd\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.371072 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/12151228-1cb9-4086-9a62-f4a9583f5f69-rootfs\") pod \"machine-config-daemon-fxqfd\" (UID: \"12151228-1cb9-4086-9a62-f4a9583f5f69\") " pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" Mar 20 07:14:04 crc 
kubenswrapper[4749]: I0320 07:14:04.371071 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3f813da7-84d4-4550-ad66-f282814444a3-multus-daemon-config\") pod \"multus-rcq9v\" (UID: \"3f813da7-84d4-4550-ad66-f282814444a3\") " pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.371088 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-host-cni-bin\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.371121 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5-env-overrides\") pod \"ovnkube-control-plane-749d76644c-68xpr\" (UID: \"c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68xpr" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.371137 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3f813da7-84d4-4550-ad66-f282814444a3-host-run-k8s-cni-cncf-io\") pod \"multus-rcq9v\" (UID: \"3f813da7-84d4-4550-ad66-f282814444a3\") " pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.371154 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3f813da7-84d4-4550-ad66-f282814444a3-host-var-lib-kubelet\") pod \"multus-rcq9v\" (UID: \"3f813da7-84d4-4550-ad66-f282814444a3\") " pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.371169 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cf5fc763-08fb-4b02-a3cd-6f85310f0e14-host\") pod \"node-ca-r9vtf\" (UID: \"cf5fc763-08fb-4b02-a3cd-6f85310f0e14\") " pod="openshift-image-registry/node-ca-r9vtf" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.371190 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2153d97b-a108-49f8-b6c8-8223ea65b878-env-overrides\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.371205 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2153d97b-a108-49f8-b6c8-8223ea65b878-ovnkube-script-lib\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.371232 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w749n\" (UniqueName: \"kubernetes.io/projected/12151228-1cb9-4086-9a62-f4a9583f5f69-kube-api-access-w749n\") pod \"machine-config-daemon-fxqfd\" (UID: \"12151228-1cb9-4086-9a62-f4a9583f5f69\") " pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.371246 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3f813da7-84d4-4550-ad66-f282814444a3-cnibin\") pod \"multus-rcq9v\" (UID: \"3f813da7-84d4-4550-ad66-f282814444a3\") " pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.371259 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/cf5fc763-08fb-4b02-a3cd-6f85310f0e14-serviceca\") pod \"node-ca-r9vtf\" (UID: \"cf5fc763-08fb-4b02-a3cd-6f85310f0e14\") " pod="openshift-image-registry/node-ca-r9vtf" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.371274 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.371436 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/19bf4391-88b7-43a0-9b6a-435261a44ed5-cnibin\") pod \"multus-additional-cni-plugins-g4qlg\" (UID: \"19bf4391-88b7-43a0-9b6a-435261a44ed5\") " pod="openshift-multus/multus-additional-cni-plugins-g4qlg" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.371467 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-node-log\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.371479 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-var-lib-openvswitch\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.371511 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-systemd-units\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.371534 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/19bf4391-88b7-43a0-9b6a-435261a44ed5-system-cni-dir\") pod \"multus-additional-cni-plugins-g4qlg\" (UID: \"19bf4391-88b7-43a0-9b6a-435261a44ed5\") " pod="openshift-multus/multus-additional-cni-plugins-g4qlg" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.371543 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2153d97b-a108-49f8-b6c8-8223ea65b878-ovnkube-config\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.371546 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.371562 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3f813da7-84d4-4550-ad66-f282814444a3-host-var-lib-cni-multus\") pod \"multus-rcq9v\" (UID: \"3f813da7-84d4-4550-ad66-f282814444a3\") " pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.371569 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-68xpr\" (UID: \"c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68xpr" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.371588 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-run-openvswitch\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.371590 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d19b89e-d048-4656-b5ce-c637190ab678-metrics-certs\") pod \"network-metrics-daemon-k56zh\" (UID: \"6d19b89e-d048-4656-b5ce-c637190ab678\") " pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.371634 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3f813da7-84d4-4550-ad66-f282814444a3-hostroot\") pod \"multus-rcq9v\" (UID: \"3f813da7-84d4-4550-ad66-f282814444a3\") " pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.371632 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3f813da7-84d4-4550-ad66-f282814444a3-host-var-lib-kubelet\") pod \"multus-rcq9v\" (UID: \"3f813da7-84d4-4550-ad66-f282814444a3\") " pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.371655 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3f813da7-84d4-4550-ad66-f282814444a3-multus-conf-dir\") pod \"multus-rcq9v\" (UID: \"3f813da7-84d4-4550-ad66-f282814444a3\") " pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.371676 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/19bf4391-88b7-43a0-9b6a-435261a44ed5-cni-binary-copy\") pod \"multus-additional-cni-plugins-g4qlg\" (UID: \"19bf4391-88b7-43a0-9b6a-435261a44ed5\") " pod="openshift-multus/multus-additional-cni-plugins-g4qlg" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.371672 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.371702 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppqc8\" (UniqueName: \"kubernetes.io/projected/19bf4391-88b7-43a0-9b6a-435261a44ed5-kube-api-access-ppqc8\") pod \"multus-additional-cni-plugins-g4qlg\" (UID: \"19bf4391-88b7-43a0-9b6a-435261a44ed5\") " pod="openshift-multus/multus-additional-cni-plugins-g4qlg" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.371739 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-68xpr\" (UID: \"c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68xpr" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.371765 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3f813da7-84d4-4550-ad66-f282814444a3-multus-cni-dir\") pod \"multus-rcq9v\" (UID: \"3f813da7-84d4-4550-ad66-f282814444a3\") " pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.371789 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3f813da7-84d4-4550-ad66-f282814444a3-cni-binary-copy\") pod \"multus-rcq9v\" (UID: \"3f813da7-84d4-4550-ad66-f282814444a3\") " pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.371809 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-host-kubelet\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.371829 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-host-run-netns\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.371887 4749 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.371903 4749 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.371918 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.371931 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: 
\"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.371945 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.371959 4749 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.371972 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.371985 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.371997 4749 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372009 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372021 4749 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372032 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372045 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372058 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372071 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372082 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372095 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: 
\"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372108 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372120 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372134 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372144 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-systemd-units\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372149 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372174 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cf5fc763-08fb-4b02-a3cd-6f85310f0e14-host\") pod \"node-ca-r9vtf\" (UID: \"cf5fc763-08fb-4b02-a3cd-6f85310f0e14\") " pod="openshift-image-registry/node-ca-r9vtf" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372178 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372203 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372215 4749 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372227 4749 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372240 4749 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372252 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" 
DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372268 4749 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372306 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372326 4749 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372344 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372359 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372376 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372377 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372391 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372408 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372424 4749 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372439 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372454 4749 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372469 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372484 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372501 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372518 4749 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372534 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372551 4749 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372567 4749 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372583 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372598 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372614 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372631 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372648 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372664 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372680 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372697 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372715 4749 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372732 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372748 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372765 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372781 4749 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372793 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372805 4749 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372818 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372831 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372843 4749 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372854 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372867 4749 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372878 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372890 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372902 4749 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372911 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2153d97b-a108-49f8-b6c8-8223ea65b878-ovnkube-config\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372915 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372956 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372970 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372982 4749 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.372995 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.373008 4749 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.373024 4749 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.373036 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.373049 4749 
reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.373061 4749 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.373073 4749 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.373085 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.373097 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.373109 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.373122 4749 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.373136 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.373147 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.373160 4749 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.373173 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.373188 4749 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.373200 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 
07:14:04.373214 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.373226 4749 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.373239 4749 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.373251 4749 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.373265 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.373296 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.373310 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.373323 4749 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.373335 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.373348 4749 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.373360 4749 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.373371 4749 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.373383 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc 
kubenswrapper[4749]: I0320 07:14:04.373395 4749 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.373407 4749 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.373418 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.373430 4749 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.373432 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5-env-overrides\") pod \"ovnkube-control-plane-749d76644c-68xpr\" (UID: \"c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68xpr" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.373440 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.373474 4749 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.373486 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.373495 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.373504 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.373512 4749 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.373523 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.373529 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/2153d97b-a108-49f8-b6c8-8223ea65b878-env-overrides\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.373553 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3f813da7-84d4-4550-ad66-f282814444a3-host-run-k8s-cni-cncf-io\") pod \"multus-rcq9v\" (UID: \"3f813da7-84d4-4550-ad66-f282814444a3\") " pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.373533 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.373575 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.373585 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.373594 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.373604 4749 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.373613 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.373622 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.373632 4749 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.373664 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3f813da7-84d4-4550-ad66-f282814444a3-system-cni-dir\") pod \"multus-rcq9v\" (UID: \"3f813da7-84d4-4550-ad66-f282814444a3\") " pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.373915 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3f813da7-84d4-4550-ad66-f282814444a3-hostroot\") pod \"multus-rcq9v\" (UID: \"3f813da7-84d4-4550-ad66-f282814444a3\") " pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: E0320 07:14:04.373963 4749 
secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 07:14:04 crc kubenswrapper[4749]: E0320 07:14:04.373997 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d19b89e-d048-4656-b5ce-c637190ab678-metrics-certs podName:6d19b89e-d048-4656-b5ce-c637190ab678 nodeName:}" failed. No retries permitted until 2026-03-20 07:14:04.873984894 +0000 UTC m=+81.423642541 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6d19b89e-d048-4656-b5ce-c637190ab678-metrics-certs") pod "network-metrics-daemon-k56zh" (UID: "6d19b89e-d048-4656-b5ce-c637190ab678") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.374079 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-68xpr\" (UID: \"c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68xpr" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.374115 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3f813da7-84d4-4550-ad66-f282814444a3-multus-conf-dir\") pod \"multus-rcq9v\" (UID: \"3f813da7-84d4-4550-ad66-f282814444a3\") " pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.374207 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2153d97b-a108-49f8-b6c8-8223ea65b878-ovnkube-script-lib\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.374349 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/12151228-1cb9-4086-9a62-f4a9583f5f69-proxy-tls\") pod \"machine-config-daemon-fxqfd\" (UID: \"12151228-1cb9-4086-9a62-f4a9583f5f69\") " pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.374415 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3f813da7-84d4-4550-ad66-f282814444a3-host-run-netns\") pod \"multus-rcq9v\" (UID: \"3f813da7-84d4-4550-ad66-f282814444a3\") " pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.374441 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-run-ovn\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.374493 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fdf0a692-3cf9-4abe-8b52-c81a040c0e54-hosts-file\") pod \"node-resolver-fnwpn\" (UID: \"fdf0a692-3cf9-4abe-8b52-c81a040c0e54\") " pod="openshift-dns/node-resolver-fnwpn" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.374514 4749 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3f813da7-84d4-4550-ad66-f282814444a3-host-var-lib-cni-bin\") pod \"multus-rcq9v\" (UID: \"3f813da7-84d4-4550-ad66-f282814444a3\") " pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.374544 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3f813da7-84d4-4550-ad66-f282814444a3-cnibin\") pod \"multus-rcq9v\" (UID: \"3f813da7-84d4-4550-ad66-f282814444a3\") " pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.374582 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-host-kubelet\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.374609 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-host-run-netns\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.371931 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-node-log\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.374651 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-host-cni-netd\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.374679 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3f813da7-84d4-4550-ad66-f282814444a3-host-run-multus-certs\") pod \"multus-rcq9v\" (UID: \"3f813da7-84d4-4550-ad66-f282814444a3\") " pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.374866 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/19bf4391-88b7-43a0-9b6a-435261a44ed5-cni-binary-copy\") pod \"multus-additional-cni-plugins-g4qlg\" (UID: \"19bf4391-88b7-43a0-9b6a-435261a44ed5\") " pod="openshift-multus/multus-additional-cni-plugins-g4qlg" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.374879 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.374954 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3f813da7-84d4-4550-ad66-f282814444a3-multus-cni-dir\") pod \"multus-rcq9v\" 
(UID: \"3f813da7-84d4-4550-ad66-f282814444a3\") " pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.375012 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3f813da7-84d4-4550-ad66-f282814444a3-os-release\") pod \"multus-rcq9v\" (UID: \"3f813da7-84d4-4550-ad66-f282814444a3\") " pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.375051 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-host-run-ovn-kubernetes\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.375063 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/12151228-1cb9-4086-9a62-f4a9583f5f69-mcd-auth-proxy-config\") pod \"machine-config-daemon-fxqfd\" (UID: \"12151228-1cb9-4086-9a62-f4a9583f5f69\") " pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.375109 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3f813da7-84d4-4550-ad66-f282814444a3-etc-kubernetes\") pod \"multus-rcq9v\" (UID: \"3f813da7-84d4-4550-ad66-f282814444a3\") " pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.375117 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-log-socket\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.375144 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-etc-openvswitch\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.375170 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/19bf4391-88b7-43a0-9b6a-435261a44ed5-os-release\") pod \"multus-additional-cni-plugins-g4qlg\" (UID: \"19bf4391-88b7-43a0-9b6a-435261a44ed5\") " pod="openshift-multus/multus-additional-cni-plugins-g4qlg" Mar 20 07:14:04 crc kubenswrapper[4749]: W0320 07:14:04.376946 4749 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.377067 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.377798 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/cf5fc763-08fb-4b02-a3cd-6f85310f0e14-serviceca\") pod \"node-ca-r9vtf\" (UID: \"cf5fc763-08fb-4b02-a3cd-6f85310f0e14\") " pod="openshift-image-registry/node-ca-r9vtf" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.379235 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2153d97b-a108-49f8-b6c8-8223ea65b878-ovn-node-metrics-cert\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.380482 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/19bf4391-88b7-43a0-9b6a-435261a44ed5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g4qlg\" (UID: \"19bf4391-88b7-43a0-9b6a-435261a44ed5\") " pod="openshift-multus/multus-additional-cni-plugins-g4qlg" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.385325 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.391168 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-68xpr\" (UID: \"c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68xpr" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.391962 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3f813da7-84d4-4550-ad66-f282814444a3-cni-binary-copy\") pod \"multus-rcq9v\" (UID: \"3f813da7-84d4-4550-ad66-f282814444a3\") " pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.392551 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3f813da7-84d4-4550-ad66-f282814444a3-multus-daemon-config\") pod \"multus-rcq9v\" (UID: \"3f813da7-84d4-4550-ad66-f282814444a3\") " pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.392679 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.392832 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkbh9\" (UniqueName: \"kubernetes.io/projected/3f813da7-84d4-4550-ad66-f282814444a3-kube-api-access-xkbh9\") pod \"multus-rcq9v\" (UID: \"3f813da7-84d4-4550-ad66-f282814444a3\") " pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.393483 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.393986 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x656g\" (UniqueName: \"kubernetes.io/projected/cf5fc763-08fb-4b02-a3cd-6f85310f0e14-kube-api-access-x656g\") pod \"node-ca-r9vtf\" (UID: \"cf5fc763-08fb-4b02-a3cd-6f85310f0e14\") " pod="openshift-image-registry/node-ca-r9vtf" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.394505 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45656\" (UniqueName: \"kubernetes.io/projected/c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5-kube-api-access-45656\") pod \"ovnkube-control-plane-749d76644c-68xpr\" (UID: \"c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68xpr" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.394571 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77vkc\" (UniqueName: \"kubernetes.io/projected/2153d97b-a108-49f8-b6c8-8223ea65b878-kube-api-access-77vkc\") pod \"ovnkube-node-tdgcw\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") " pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.394731 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgjwb\" (UniqueName: \"kubernetes.io/projected/fdf0a692-3cf9-4abe-8b52-c81a040c0e54-kube-api-access-mgjwb\") pod \"node-resolver-fnwpn\" (UID: \"fdf0a692-3cf9-4abe-8b52-c81a040c0e54\") " pod="openshift-dns/node-resolver-fnwpn" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.394957 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w749n\" (UniqueName: \"kubernetes.io/projected/12151228-1cb9-4086-9a62-f4a9583f5f69-kube-api-access-w749n\") pod \"machine-config-daemon-fxqfd\" (UID: \"12151228-1cb9-4086-9a62-f4a9583f5f69\") " pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.396392 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcf5r\" (UniqueName: \"kubernetes.io/projected/6d19b89e-d048-4656-b5ce-c637190ab678-kube-api-access-mcf5r\") pod \"network-metrics-daemon-k56zh\" (UID: \"6d19b89e-d048-4656-b5ce-c637190ab678\") " pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.396537 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppqc8\" (UniqueName: \"kubernetes.io/projected/19bf4391-88b7-43a0-9b6a-435261a44ed5-kube-api-access-ppqc8\") pod \"multus-additional-cni-plugins-g4qlg\" 
(UID: \"19bf4391-88b7-43a0-9b6a-435261a44ed5\") " pod="openshift-multus/multus-additional-cni-plugins-g4qlg" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.400079 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.400115 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.400127 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.400147 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.400159 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:04Z","lastTransitionTime":"2026-03-20T07:14:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.403049 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.403059 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.407558 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.409227 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r9vtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5fc763-08fb-4b02-a3cd-6f85310f0e14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x656g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r9vtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.413644 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.418108 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rcq9v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f813da7-84d4-4550-ad66-f282814444a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkbh9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rcq9v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.431649 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2153d97b-a108-49f8-b6c8-8223ea65b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tdgcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.441127 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.448896 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68xpr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68xpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.462928 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.468438 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.468929 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.468950 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.468958 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.468970 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.468979 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:04Z","lastTransitionTime":"2026-03-20T07:14:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.479914 4749 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\""
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.479951 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.479963 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.479973 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.479982 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.479990 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.480881 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g4qlg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19bf4391-88b7-43a0-9b6a-435261a44ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g4qlg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.484581 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 07:14:04 crc kubenswrapper[4749]: E0320 07:14:04.484640 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cbc31b-af36-4be8-8e88-99f024097007\\\",\\\"systemUUID\\\":\\\"42f570dd-c9b2-4d24-870f-033a21aa11c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.488565 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.488602 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.488612 4749 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.488627 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.488638 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:04Z","lastTransitionTime":"2026-03-20T07:14:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.491214 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnwpn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdf0a692-3cf9-4abe-8b52-c81a040c0e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgjwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnwpn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.498005 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-r9vtf" Mar 20 07:14:04 crc kubenswrapper[4749]: E0320 07:14:04.501845 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cbc31b-af36-4be8-8e88-99f024097007\\\",\\\"systemUUID\\\":\\\"42f570dd-c9b2-4d24-870f-033a21aa11c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.505089 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.505110 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.505118 4749 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.505131 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.505141 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:04Z","lastTransitionTime":"2026-03-20T07:14:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.505371 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k56zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d19b89e-d048-4656-b5ce-c637190ab678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k56zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 07:14:04 crc kubenswrapper[4749]: W0320 07:14:04.510839 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-86bf81e427ae1bee34862a92890333bca9d46388ea58c91564a6abdaeea799d0 WatchSource:0}: Error finding container 86bf81e427ae1bee34862a92890333bca9d46388ea58c91564a6abdaeea799d0: Status 404 returned error can't find the container with id 86bf81e427ae1bee34862a92890333bca9d46388ea58c91564a6abdaeea799d0 Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.513821 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" Mar 20 07:14:04 crc kubenswrapper[4749]: E0320 07:14:04.519496 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cbc31b-af36-4be8-8e88-99f024097007\\\",\\\"systemUUID\\\":\\\"42f570dd-c9b2-4d24-870f-033a21aa11c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.522426 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 07:14:04 crc kubenswrapper[4749]: W0320 07:14:04.522691 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf5fc763_08fb_4b02_a3cd_6f85310f0e14.slice/crio-3b3e3c50fb7b520e87f04b3930217818bb83d855cb976c21ec7ec643c781ba3c WatchSource:0}: Error finding container 3b3e3c50fb7b520e87f04b3930217818bb83d855cb976c21ec7ec643c781ba3c: Status 404 returned error can't find the container with id 3b3e3c50fb7b520e87f04b3930217818bb83d855cb976c21ec7ec643c781ba3c Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.524988 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.525019 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.525028 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.525042 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.525052 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:04Z","lastTransitionTime":"2026-03-20T07:14:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.525749 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.533973 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68xpr" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.538494 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 07:14:04 crc kubenswrapper[4749]: E0320 07:14:04.542343 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cbc31b-af36-4be8-8e88-99f024097007\\\",\\\"systemUUID\\\":\\\"42f570dd-c9b2-4d24-870f-033a21aa11c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.545865 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.545931 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.545948 4749 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.545973 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.545990 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:04Z","lastTransitionTime":"2026-03-20T07:14:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.546875 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-r9vtf" event={"ID":"cf5fc763-08fb-4b02-a3cd-6f85310f0e14","Type":"ContainerStarted","Data":"3b3e3c50fb7b520e87f04b3930217818bb83d855cb976c21ec7ec643c781ba3c"} Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.549594 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"86bf81e427ae1bee34862a92890333bca9d46388ea58c91564a6abdaeea799d0"} Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.550454 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12151228-1cb9-4086-9a62-f4a9583f5f69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxqfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.551518 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"62a846847f891636cb608615d8b203a268f32aacf80b6d2dbe4261b04bd307ef"} Mar 20 07:14:04 crc kubenswrapper[4749]: W0320 07:14:04.558003 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12151228_1cb9_4086_9a62_f4a9583f5f69.slice/crio-6a6e47219532bb8589fa08a6e406c39a13df3304dccf9d16d12ef52dd435ad67 WatchSource:0}: Error finding container 6a6e47219532bb8589fa08a6e406c39a13df3304dccf9d16d12ef52dd435ad67: Status 404 returned error can't find the container with id 6a6e47219532bb8589fa08a6e406c39a13df3304dccf9d16d12ef52dd435ad67 Mar 20 07:14:04 crc kubenswrapper[4749]: E0320 07:14:04.558391 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae
669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cbc31b-af36-4be8-8e88-99f024097007\\\",\\\"systemUUID\\\":\\\"42f570dd-c9b2-4d24-870f-033a21aa11c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 07:14:04 crc kubenswrapper[4749]: E0320 07:14:04.558538 4749 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.559842 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.559865 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.559876 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.559894 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.559905 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:04Z","lastTransitionTime":"2026-03-20T07:14:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:04 crc kubenswrapper[4749]: W0320 07:14:04.563386 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-fbe1ef97bc4c3607a92b43325baa083ebc0e8413768a79c6eadc9219cd519817 WatchSource:0}: Error finding container fbe1ef97bc4c3607a92b43325baa083ebc0e8413768a79c6eadc9219cd519817: Status 404 returned error can't find the container with id fbe1ef97bc4c3607a92b43325baa083ebc0e8413768a79c6eadc9219cd519817 Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.563651 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rcq9v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f813da7-84d4-4550-ad66-f282814444a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkbh9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rcq9v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 07:14:04 crc kubenswrapper[4749]: W0320 07:14:04.564278 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7e40eb5_dd2b_4e2e_8ae8_afcb760595b5.slice/crio-1de01cc82bdec1453fe52089657eb26298888bbbeba67ee51c356897ca5195c7 WatchSource:0}: Error finding container 1de01cc82bdec1453fe52089657eb26298888bbbeba67ee51c356897ca5195c7: Status 404 returned error can't find the container with id 1de01cc82bdec1453fe52089657eb26298888bbbeba67ee51c356897ca5195c7 Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.572815 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-g4qlg" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.584326 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2153d97b-a108-49f8-b6c8-8223ea65b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d77
3257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev
/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\
\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tdgcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.594520 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-rcq9v" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.596466 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.606614 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.614695 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r9vtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5fc763-08fb-4b02-a3cd-6f85310f0e14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x656g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r9vtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.624742 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.632994 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68xpr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68xpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.635004 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-fnwpn"
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.662435 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.662773 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.662788 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.662803 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.662815 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:04Z","lastTransitionTime":"2026-03-20T07:14:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.663415 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw"
Mar 20 07:14:04 crc kubenswrapper[4749]: W0320 07:14:04.682985 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdf0a692_3cf9_4abe_8b52_c81a040c0e54.slice/crio-e8f9ed95d6c3f155d4413479e6821e12a603e5cc84c0466c00eb4793260ebe41 WatchSource:0}: Error finding container e8f9ed95d6c3f155d4413479e6821e12a603e5cc84c0466c00eb4793260ebe41: Status 404 returned error can't find the container with id e8f9ed95d6c3f155d4413479e6821e12a603e5cc84c0466c00eb4793260ebe41
Mar 20 07:14:04 crc kubenswrapper[4749]: W0320 07:14:04.686003 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2153d97b_a108_49f8_b6c8_8223ea65b878.slice/crio-689dbeb0340cfee1fccb56ef27e4c0b4ce438ecf525e95c1da70ea2bc9629731 WatchSource:0}: Error finding container 689dbeb0340cfee1fccb56ef27e4c0b4ce438ecf525e95c1da70ea2bc9629731: Status 404 returned error can't find the container with id 689dbeb0340cfee1fccb56ef27e4c0b4ce438ecf525e95c1da70ea2bc9629731
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.766527 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.766568 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.766580 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.766596 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.766606 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:04Z","lastTransitionTime":"2026-03-20T07:14:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.782608 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 07:14:04 crc kubenswrapper[4749]: E0320 07:14:04.782742 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 07:14:05.782717321 +0000 UTC m=+82.332374998 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.782931 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.782971 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 07:14:04 crc kubenswrapper[4749]: E0320 07:14:04.783014 4749 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 07:14:04 crc kubenswrapper[4749]: E0320 07:14:04.783067 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 07:14:05.783051828 +0000 UTC m=+82.332709475 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 07:14:04 crc kubenswrapper[4749]: E0320 07:14:04.783087 4749 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 07:14:04 crc kubenswrapper[4749]: E0320 07:14:04.783132 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 07:14:05.78312285 +0000 UTC m=+82.332780507 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.868088 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.868118 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.868126 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.868139 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.868148 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:04Z","lastTransitionTime":"2026-03-20T07:14:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.883261 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d19b89e-d048-4656-b5ce-c637190ab678-metrics-certs\") pod \"network-metrics-daemon-k56zh\" (UID: \"6d19b89e-d048-4656-b5ce-c637190ab678\") " pod="openshift-multus/network-metrics-daemon-k56zh"
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.883317 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.883340 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 07:14:04 crc kubenswrapper[4749]: E0320 07:14:04.883458 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 20 07:14:04 crc kubenswrapper[4749]: E0320 07:14:04.883476 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 20 07:14:04 crc kubenswrapper[4749]: E0320 07:14:04.883485 4749 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 07:14:04 crc kubenswrapper[4749]: E0320 07:14:04.883518 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 07:14:05.883506571 +0000 UTC m=+82.433164208 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 07:14:04 crc kubenswrapper[4749]: E0320 07:14:04.883763 4749 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 20 07:14:04 crc kubenswrapper[4749]: E0320 07:14:04.883791 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d19b89e-d048-4656-b5ce-c637190ab678-metrics-certs podName:6d19b89e-d048-4656-b5ce-c637190ab678 nodeName:}" failed. No retries permitted until 2026-03-20 07:14:05.883783518 +0000 UTC m=+82.433441165 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6d19b89e-d048-4656-b5ce-c637190ab678-metrics-certs") pod "network-metrics-daemon-k56zh" (UID: "6d19b89e-d048-4656-b5ce-c637190ab678") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 20 07:14:04 crc kubenswrapper[4749]: E0320 07:14:04.883830 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 20 07:14:04 crc kubenswrapper[4749]: E0320 07:14:04.883839 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 20 07:14:04 crc kubenswrapper[4749]: E0320 07:14:04.883846 4749 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 07:14:04 crc kubenswrapper[4749]: E0320 07:14:04.883865 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 07:14:05.883859669 +0000 UTC m=+82.433517316 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.971447 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.971491 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.971503 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.971530 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 07:14:04 crc kubenswrapper[4749]: I0320 07:14:04.971541 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:04Z","lastTransitionTime":"2026-03-20T07:14:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.075867 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.075930 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.075949 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.075976 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.075994 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:05Z","lastTransitionTime":"2026-03-20T07:14:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.179156 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.179200 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.179214 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.179234 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.179247 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:05Z","lastTransitionTime":"2026-03-20T07:14:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.280975 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.281014 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.281026 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.281042 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.281054 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:05Z","lastTransitionTime":"2026-03-20T07:14:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.383695 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.383751 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.383768 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.383792 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.383808 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:05Z","lastTransitionTime":"2026-03-20T07:14:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.486674 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.486724 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.486742 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.486767 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.486783 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:05Z","lastTransitionTime":"2026-03-20T07:14:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.556210 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" event={"ID":"12151228-1cb9-4086-9a62-f4a9583f5f69","Type":"ContainerStarted","Data":"727db5182d25f135cceb40ce56d93c74fd6ff79a08e042fded129a1b8c96eb38"}
Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.556704 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" event={"ID":"12151228-1cb9-4086-9a62-f4a9583f5f69","Type":"ContainerStarted","Data":"e7e97608b8dbd15f9f6a4df363aa16c0f7e4a3d501a4182627876064290b63e9"}
Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.556748 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" event={"ID":"12151228-1cb9-4086-9a62-f4a9583f5f69","Type":"ContainerStarted","Data":"6a6e47219532bb8589fa08a6e406c39a13df3304dccf9d16d12ef52dd435ad67"}
Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.558534 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"114c64c832173caefe8b9d0030fd0ac53be4c97636f0d1735ad2b5149e38ab28"}
Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.558606 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"78309ee345e93c6d9fb93f2f6cd3b3b80b2a7feec2b0fbca5962e00d978c66c9"}
Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.558628 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"fbe1ef97bc4c3607a92b43325baa083ebc0e8413768a79c6eadc9219cd519817"}
Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.560021 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"c28e09a3eab6484907f72f0c4e3809f5a04d1b344fba717e6639f97c544acf83"}
Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.561936 4749 generic.go:334] "Generic (PLEG): container finished" podID="19bf4391-88b7-43a0-9b6a-435261a44ed5" containerID="1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf" exitCode=0
Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.561987 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g4qlg" event={"ID":"19bf4391-88b7-43a0-9b6a-435261a44ed5","Type":"ContainerDied","Data":"1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf"}
Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.562023 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g4qlg" event={"ID":"19bf4391-88b7-43a0-9b6a-435261a44ed5","Type":"ContainerStarted","Data":"9fa815388fd26f527a5bdaa28d13c74348b49a921d7ab35fbffe65737ab4b145"}
Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.563786 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-fnwpn" event={"ID":"fdf0a692-3cf9-4abe-8b52-c81a040c0e54","Type":"ContainerStarted","Data":"81e24c08ed2a41f1d8a54c1c9edf5511f5ef6016bbdfa19cf6c40e8a639e1e45"}
Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.563824 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-fnwpn" event={"ID":"fdf0a692-3cf9-4abe-8b52-c81a040c0e54","Type":"ContainerStarted","Data":"e8f9ed95d6c3f155d4413479e6821e12a603e5cc84c0466c00eb4793260ebe41"}
Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.565594 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rcq9v" event={"ID":"3f813da7-84d4-4550-ad66-f282814444a3","Type":"ContainerStarted","Data":"f01cb7c52842132ac657c21cb0cac4167a7b0c07ac20803552a8290a0d19e008"}
Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.565659 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rcq9v" event={"ID":"3f813da7-84d4-4550-ad66-f282814444a3","Type":"ContainerStarted","Data":"2f01444b078eba28fa0809cf352f2b05e33ce5f835da85496cea094d2c2614e2"}
Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.567667 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-r9vtf" event={"ID":"cf5fc763-08fb-4b02-a3cd-6f85310f0e14","Type":"ContainerStarted","Data":"38353c2a5737bf0e7e3552efcf7c55c31fded95481f21b06f9d364a944dbeebd"}
Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.569447 4749 generic.go:334] "Generic (PLEG): container finished" podID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerID="0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc" exitCode=0
Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.569545 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" event={"ID":"2153d97b-a108-49f8-b6c8-8223ea65b878","Type":"ContainerDied","Data":"0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc"}
Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.569598 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" event={"ID":"2153d97b-a108-49f8-b6c8-8223ea65b878","Type":"ContainerStarted","Data":"689dbeb0340cfee1fccb56ef27e4c0b4ce438ecf525e95c1da70ea2bc9629731"}
Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.572219 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68xpr" event={"ID":"c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5","Type":"ContainerStarted","Data":"c96bd69f6d76b0604262b3105aafd077a3b603667218b7e6b81a5fcb0b49b3e5"}
Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.572270 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68xpr" event={"ID":"c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5","Type":"ContainerStarted","Data":"080ba314933e5d85aef3e133f4372bc6e5881f2b3cf3ce1e769927e6798f328e"}
Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.572353 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68xpr" event={"ID":"c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5","Type":"ContainerStarted","Data":"1de01cc82bdec1453fe52089657eb26298888bbbeba67ee51c356897ca5195c7"}
Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.590817 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.590850 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.590861 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.590877 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.590887 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:05Z","lastTransitionTime":"2026-03-20T07:14:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.596501 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g4qlg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19bf4391-88b7-43a0-9b6a-435261a44ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\
\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g4qlg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:05Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.613065 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnwpn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdf0a692-3cf9-4abe-8b52-c81a040c0e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgjwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnwpn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:05Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.628758 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k56zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d19b89e-d048-4656-b5ce-c637190ab678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k56zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:05Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.643719 4749 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:05Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.666215 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:05Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.684194 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:05Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.693890 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.693930 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.693941 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.693957 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.693984 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:05Z","lastTransitionTime":"2026-03-20T07:14:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.705494 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12151228-1cb9-4086-9a62-f4a9583f5f69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://727db5182d25f135cceb40ce56d93c74fd6ff79a08e042fded129a1b8c96eb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e97608b8dbd15f9f6a4df363aa16c0f7e4a3d501a4182627876064290b63e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxqfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:05Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.732509 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:05Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.761545 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:05Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.779868 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r9vtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5fc763-08fb-4b02-a3cd-6f85310f0e14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x656g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r9vtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:05Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.793016 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rcq9v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f813da7-84d4-4550-ad66-f282814444a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkbh9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rcq9v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:05Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.795876 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.795919 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.795932 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.795949 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.795961 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:05Z","lastTransitionTime":"2026-03-20T07:14:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.795983 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.796124 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:14:05 crc kubenswrapper[4749]: E0320 07:14:05.796174 4749 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 07:14:05 crc kubenswrapper[4749]: E0320 07:14:05.796189 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 07:14:07.796161699 +0000 UTC m=+84.345819356 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:14:05 crc kubenswrapper[4749]: E0320 07:14:05.796219 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 07:14:07.796208301 +0000 UTC m=+84.345865958 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.796259 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:14:05 crc kubenswrapper[4749]: E0320 07:14:05.796363 4749 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 07:14:05 crc kubenswrapper[4749]: E0320 07:14:05.796401 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 07:14:07.796388275 +0000 UTC m=+84.346045922 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.811348 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2153d97b-a108-49f8-b6c8-8223ea65b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tdgcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:05Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.827419 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:05Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.841113 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68xpr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68xpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:05Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.859274 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c28e09a3eab6484907f72f0c4e3809f5a04d1b344fba717e6639f97c544acf83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:05Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.869777 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:05Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.883322 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12151228-1cb9-4086-9a62-f4a9583f5f69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://727db5182d25f135cceb40ce56d93c74fd6ff79a08e042fded129a1b8c96eb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e97608b8dbd15f9f6a4df363aa16c0f7e4a3d501a4182627876064290b63e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxqfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:05Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.897501 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.897542 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.897582 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d19b89e-d048-4656-b5ce-c637190ab678-metrics-certs\") pod \"network-metrics-daemon-k56zh\" (UID: \"6d19b89e-d048-4656-b5ce-c637190ab678\") " pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:14:05 crc kubenswrapper[4749]: E0320 07:14:05.897676 4749 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 07:14:05 crc kubenswrapper[4749]: E0320 07:14:05.897692 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 07:14:05 crc kubenswrapper[4749]: E0320 07:14:05.897728 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d19b89e-d048-4656-b5ce-c637190ab678-metrics-certs podName:6d19b89e-d048-4656-b5ce-c637190ab678 nodeName:}" failed. No retries permitted until 2026-03-20 07:14:07.897715108 +0000 UTC m=+84.447372755 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6d19b89e-d048-4656-b5ce-c637190ab678-metrics-certs") pod "network-metrics-daemon-k56zh" (UID: "6d19b89e-d048-4656-b5ce-c637190ab678") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 07:14:05 crc kubenswrapper[4749]: E0320 07:14:05.897729 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 07:14:05 crc kubenswrapper[4749]: E0320 07:14:05.897747 4749 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 07:14:05 crc kubenswrapper[4749]: E0320 07:14:05.897742 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 07:14:05 crc kubenswrapper[4749]: E0320 07:14:05.897771 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 07:14:05 crc kubenswrapper[4749]: E0320 07:14:05.897784 4749 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 07:14:05 crc kubenswrapper[4749]: E0320 07:14:05.897816 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 07:14:07.89779787 +0000 UTC m=+84.447455527 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 07:14:05 crc kubenswrapper[4749]: E0320 07:14:05.897841 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 07:14:07.897824911 +0000 UTC m=+84.447482558 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.898919 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.898961 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.898975 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.898991 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.899011 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:05Z","lastTransitionTime":"2026-03-20T07:14:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.902991 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2153d97b-a108-49f8-b6c8-8223ea65b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tdgcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:05Z 
is after 2025-08-24T17:21:41Z" Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.914607 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:05Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.931386 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:05Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.946486 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r9vtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5fc763-08fb-4b02-a3cd-6f85310f0e14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38353c2a5737bf0e7e3552efcf7c55c31fded95481f21b06f9d364a944dbeebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x656g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r9vtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:05Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.959753 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rcq9v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f813da7-84d4-4550-ad66-f282814444a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01cb7c52842132ac657c21cb0cac4167a7b0c07ac20803552a8290a0d19e008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkbh9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rcq9v\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:05Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.970700 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://114c64c832173caefe8b9d0030fd0ac53be4c97636f0d1735ad2b5149e38ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78309ee345e93c6d9fb93f2f6cd3b3b80b2a7feec2b0fbca5962e00d978c66c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:05Z is after 
2025-08-24T17:21:41Z" Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.980748 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68xpr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://080ba314933e5d85aef3e133f4372bc6e5881f2b3cf3ce1e769927e6798f328e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c96bd69f6d76b0604262b3105aafd077a3b603667218b7e6b81a5fcb0b49b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68xpr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:05Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:05 crc kubenswrapper[4749]: I0320 07:14:05.991994 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:05Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.001488 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.001525 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.001537 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.001551 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.001561 4749 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:06Z","lastTransitionTime":"2026-03-20T07:14:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.004354 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g4qlg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19bf4391-88b7-43a0-9b6a-435261a44ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g4qlg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:06Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:06 crc 
kubenswrapper[4749]: I0320 07:14:06.013741 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnwpn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdf0a692-3cf9-4abe-8b52-c81a040c0e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e24c08ed2a41f1d8a54c1c9edf5511f5ef6016bbdfa19cf6c40e8a639e1e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgjwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnwpn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:06Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.023355 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k56zh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d19b89e-d048-4656-b5ce-c637190ab678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k56zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:06Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.104258 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.104326 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.104338 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.104356 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.104369 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:06Z","lastTransitionTime":"2026-03-20T07:14:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.176562 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.176629 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.176632 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.176697 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:14:06 crc kubenswrapper[4749]: E0320 07:14:06.177067 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 07:14:06 crc kubenswrapper[4749]: E0320 07:14:06.177570 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 07:14:06 crc kubenswrapper[4749]: E0320 07:14:06.177714 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678" Mar 20 07:14:06 crc kubenswrapper[4749]: E0320 07:14:06.177467 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.185563 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.186714 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.188333 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.189277 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.190658 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.191440 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.192403 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.193858 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.194746 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.196073 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.196806 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.198441 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.199218 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.199997 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.201334 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.202097 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.203402 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.203954 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.204714 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.206079 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.206789 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.207213 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.207242 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.207250 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.207265 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.207273 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:06Z","lastTransitionTime":"2026-03-20T07:14:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.208098 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.208710 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.210237 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.211926 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.212752 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.214275 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.214951 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.216202 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.216846 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.218063 4749 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.218204 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.220683 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.221927 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.222432 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" 
path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.223910 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.224650 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.226025 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.226710 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.227729 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.228195 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.229201 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.229863 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.230909 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.231459 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.232523 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.233235 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.234526 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.235008 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.235884 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.236440 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.237570 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.238131 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.238584 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.309584 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.309623 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.309634 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.309651 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.309662 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:06Z","lastTransitionTime":"2026-03-20T07:14:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.412270 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.412603 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.412613 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.412626 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.412636 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:06Z","lastTransitionTime":"2026-03-20T07:14:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.515562 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.515590 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.515597 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.515610 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.515618 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:06Z","lastTransitionTime":"2026-03-20T07:14:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.586411 4749 generic.go:334] "Generic (PLEG): container finished" podID="19bf4391-88b7-43a0-9b6a-435261a44ed5" containerID="766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17" exitCode=0 Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.586480 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g4qlg" event={"ID":"19bf4391-88b7-43a0-9b6a-435261a44ed5","Type":"ContainerDied","Data":"766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17"} Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.601034 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" event={"ID":"2153d97b-a108-49f8-b6c8-8223ea65b878","Type":"ContainerStarted","Data":"f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be"} Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.601079 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" event={"ID":"2153d97b-a108-49f8-b6c8-8223ea65b878","Type":"ContainerStarted","Data":"ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e"} Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.601091 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" event={"ID":"2153d97b-a108-49f8-b6c8-8223ea65b878","Type":"ContainerStarted","Data":"e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd"} Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.601105 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" event={"ID":"2153d97b-a108-49f8-b6c8-8223ea65b878","Type":"ContainerStarted","Data":"adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189"} Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.601115 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" event={"ID":"2153d97b-a108-49f8-b6c8-8223ea65b878","Type":"ContainerStarted","Data":"e4f35833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2"} Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.606868 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k56zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d19b89e-d048-4656-b5ce-c637190ab678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k56zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:06Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.630757 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.630810 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.630823 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.630843 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.630861 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:06Z","lastTransitionTime":"2026-03-20T07:14:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.630662 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:06Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.647896 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g4qlg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19bf4391-88b7-43a0-9b6a-435261a44ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":
{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g4qlg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:06Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.657861 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnwpn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdf0a692-3cf9-4abe-8b52-c81a040c0e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e24c08ed2a41f1d8a54c1c9edf5511f5ef6016bbdfa19cf6c40e8a639e1e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgjwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnwpn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-20T07:14:06Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.677677 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c28e09a3eab6484907f72f0c4e3809f5a04d1b344fba717e6639f97c544acf83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:06Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.694267 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:06Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.710618 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12151228-1cb9-4086-9a62-f4a9583f5f69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://727db5182d25f135cceb40ce56d93c74fd6ff79a08e042fded129a1b8c96eb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":
\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e97608b8dbd15f9f6a4df363aa16c0f7e4a3d501a4182627876064290b63e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxqfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:06Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.722369 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r9vtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5fc763-08fb-4b02-a3cd-6f85310f0e14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38353c2a5737bf0e7e3552efcf7c55c31fded95481f21b06f9d364a944dbeebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-
access-x656g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r9vtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:06Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.736650 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rcq9v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f813da7-84d4-4550-ad66-f282814444a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01cb7c52842132ac657c21cb0cac4167a7b0c07ac20803552a8290a0d19e008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multu
s-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkbh9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rcq9v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:06Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.736842 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.736856 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.736864 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.736875 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.736884 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:06Z","lastTransitionTime":"2026-03-20T07:14:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.757185 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2153d97b-a108-49f8-b6c8-8223ea65b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c
8c185c168c24fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tdgcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:06Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.770058 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:06Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.785929 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:06Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.798734 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://114c64c832173caefe8b9d0030fd0ac53be4c97636f0d1735ad2b5149e38ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78309ee345e93c6d9fb93f2f6cd3b3b80b2a7feec2b0fbca5962e00d978c66c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:06Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.809626 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68xpr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://080ba314933e5d85aef3e133f4372bc6e5881f2b3cf3ce1e769927e6798f328e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c96bd69f6d76b0604262b3105aafd077a3b603667218b7e6b81a5fcb0b49b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\
\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68xpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:06Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.840016 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.840063 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.840074 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.840089 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.840098 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:06Z","lastTransitionTime":"2026-03-20T07:14:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.943531 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.943794 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.943803 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.943817 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:06 crc kubenswrapper[4749]: I0320 07:14:06.943825 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:06Z","lastTransitionTime":"2026-03-20T07:14:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.046241 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.046349 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.046368 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.046393 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.046409 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:07Z","lastTransitionTime":"2026-03-20T07:14:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.149343 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.149406 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.149424 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.149448 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.149467 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:07Z","lastTransitionTime":"2026-03-20T07:14:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.194318 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.194395 4749 scope.go:117] "RemoveContainer" containerID="f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8" Mar 20 07:14:07 crc kubenswrapper[4749]: E0320 07:14:07.194783 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.196046 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.252446 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.252507 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.252520 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.252539 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.252551 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:07Z","lastTransitionTime":"2026-03-20T07:14:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.356705 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.356767 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.356785 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.356809 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.356826 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:07Z","lastTransitionTime":"2026-03-20T07:14:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.459325 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.459379 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.459404 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.459428 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.459445 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:07Z","lastTransitionTime":"2026-03-20T07:14:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.562274 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.562375 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.562391 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.562415 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.562434 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:07Z","lastTransitionTime":"2026-03-20T07:14:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.607403 4749 generic.go:334] "Generic (PLEG): container finished" podID="19bf4391-88b7-43a0-9b6a-435261a44ed5" containerID="fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18" exitCode=0 Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.607475 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g4qlg" event={"ID":"19bf4391-88b7-43a0-9b6a-435261a44ed5","Type":"ContainerDied","Data":"fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18"} Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.618460 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" event={"ID":"2153d97b-a108-49f8-b6c8-8223ea65b878","Type":"ContainerStarted","Data":"390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e"} Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.619224 4749 scope.go:117] "RemoveContainer" containerID="f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8" Mar 20 07:14:07 crc kubenswrapper[4749]: E0320 07:14:07.619532 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.635312 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c28e09a3eab6484907f72f0c4e3809f5a04d1b344fba717e6639f97c544acf83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:07Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.664833 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:07Z is after 2025-08-24T17:21:41Z"
Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.665483 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.665562 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.665589 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.665620 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.665638 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:07Z","lastTransitionTime":"2026-03-20T07:14:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.683535 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12151228-1cb9-4086-9a62-f4a9583f5f69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://727db5182d25f135cceb40ce56d93c74fd6ff79a08e042fded129a1b8c96eb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e97608b8dbd15f9f6a4df363aa16c0f7e4a3d501a4182627876064290b63e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxqfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:07Z is after 2025-08-24T17:21:41Z"
Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.719158 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ade5670f-28bc-4c68-b28c-cec1ee830afe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f986500b6f9e4ab2cf3366a7e05e9274f9192bdc576e52c82f8dafc9f1ce37c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0549d8b1f6c2132ed8356ac6c67078f9431cf7a9b057922e0ba5e2eb9f7f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f97471d68760ce0f43e5c1c0bafa7c6b429812dd58e2b2fa2eabd378a0789d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://478f80cc5a3895ca8ae8adbaa46990b39941e2824b2bf1c93ca34bb7d15cbdd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a361154efd67448ce4f9008639d02864d9d3aa766b0937f3729b13a5d0b8948a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:07Z is after 2025-08-24T17:21:41Z"
Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.735653 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:07Z is after 2025-08-24T17:21:41Z"
Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.751590 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:07Z is after 2025-08-24T17:21:41Z"
Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.765704 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r9vtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5fc763-08fb-4b02-a3cd-6f85310f0e14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38353c2a5737bf0e7e3552efcf7c55c31fded95481f21b06f9d364a944dbeebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x656g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r9vtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:07Z is after 2025-08-24T17:21:41Z"
Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.767970 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.768019 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.768032 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.768051 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.768065 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:07Z","lastTransitionTime":"2026-03-20T07:14:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.782199 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rcq9v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f813da7-84d4-4550-ad66-f282814444a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01cb7c52842132ac657c21cb0cac4167a7b0c07ac20803552a8290a0d19e008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkbh9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rcq9v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:07Z is after 2025-08-24T17:21:41Z"
Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.798665 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2153d97b-a108-49f8-b6c8-8223ea65b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tdgcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:07Z is after 2025-08-24T17:21:41Z"
Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.811301 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://114c64c832173caefe8b9d0030fd0ac53be4c97636f0d1735ad2b5149e38ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78309ee345e93c6d9fb93f2f6cd3b3b80b2a7feec2b0fbca5962e00d978c66c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:07Z is after 2025-08-24T17:21:41Z"
Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.819444 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.819582 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.819608 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 07:14:07 crc kubenswrapper[4749]: E0320 07:14:07.819686 4749 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 07:14:07 crc kubenswrapper[4749]: E0320 07:14:07.819758 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 07:14:11.819691045 +0000 UTC m=+88.369348712 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 07:14:07 crc kubenswrapper[4749]: E0320 07:14:07.819783 4749 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 07:14:07 crc kubenswrapper[4749]: E0320 07:14:07.819819 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 07:14:11.819805348 +0000 UTC m=+88.369463115 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 07:14:07 crc kubenswrapper[4749]: E0320 07:14:07.819865 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 07:14:11.819832118 +0000 UTC m=+88.369489775 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.822136 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68xpr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://080ba314933e5d85aef3e133f4372bc6e5881f2b3cf3ce1e769927e6798f328e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c96bd69f6d76b0604262b3105aafd077a3b603667218b7e6b81a5fcb0b49b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68xpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:07Z is after 2025-08-24T17:21:41Z"
Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.836176 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d36aabe4-f4b7-4552-848b-0c22f7ac4753\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a9d3d56425dd88c89608d446f6d44c5f90644cea243dd023e74c5630a0a99e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e71bf5e132166e8d3e2f33eb325502e54ff36380220a07917135b27ebe41c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b332a4612c6855c57c6c15a305a1f56099dab01f849027ea2eeda56718010cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T07:13:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 07:13:46.902726 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 07:13:46.902897 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 07:13:46.903679 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-172154110/tls.crt::/tmp/serving-cert-172154110/tls.key\\\\\\\"\\\\nI0320 07:13:47.353972 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 07:13:47.356217 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 07:13:47.356236 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 07:13:47.356252 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 07:13:47.356257 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 07:13:47.360047 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 07:13:47.360067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 07:13:47.360094 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360107 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360118 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 07:13:47.360127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 07:13:47.360134 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 07:13:47.360142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 07:13:47.361128 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:13:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb6e64ecd020e07bd8f22e52fcf960c975a09da0f06a9f43daf5bfbff01de3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:07Z is after 2025-08-24T17:21:41Z"
Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.847939 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:07Z is after 2025-08-24T17:21:41Z"
Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.861003 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g4qlg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19bf4391-88b7-43a0-9b6a-435261a44ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",
\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g4qlg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:07Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.869496 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnwpn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdf0a692-3cf9-4abe-8b52-c81a040c0e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e24c08ed2a41f1d8a54c1c9edf5511f5ef6016bbdfa19cf6c40e8a639e1e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgjwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnwpn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:07Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.872388 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.872431 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.872441 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.872456 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.872469 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:07Z","lastTransitionTime":"2026-03-20T07:14:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.879860 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k56zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d19b89e-d048-4656-b5ce-c637190ab678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k56zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:07Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.920984 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.921038 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.921096 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d19b89e-d048-4656-b5ce-c637190ab678-metrics-certs\") pod \"network-metrics-daemon-k56zh\" (UID: \"6d19b89e-d048-4656-b5ce-c637190ab678\") " pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:14:07 crc kubenswrapper[4749]: E0320 07:14:07.921124 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 07:14:07 crc kubenswrapper[4749]: E0320 07:14:07.921143 4749 projected.go:288] Couldn't get 
configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 07:14:07 crc kubenswrapper[4749]: E0320 07:14:07.921154 4749 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 07:14:07 crc kubenswrapper[4749]: E0320 07:14:07.921190 4749 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 07:14:07 crc kubenswrapper[4749]: E0320 07:14:07.921199 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 07:14:11.921185833 +0000 UTC m=+88.470843480 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 07:14:07 crc kubenswrapper[4749]: E0320 07:14:07.921228 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d19b89e-d048-4656-b5ce-c637190ab678-metrics-certs podName:6d19b89e-d048-4656-b5ce-c637190ab678 nodeName:}" failed. No retries permitted until 2026-03-20 07:14:11.921215564 +0000 UTC m=+88.470873221 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6d19b89e-d048-4656-b5ce-c637190ab678-metrics-certs") pod "network-metrics-daemon-k56zh" (UID: "6d19b89e-d048-4656-b5ce-c637190ab678") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 07:14:07 crc kubenswrapper[4749]: E0320 07:14:07.921342 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 07:14:07 crc kubenswrapper[4749]: E0320 07:14:07.921380 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 07:14:07 crc kubenswrapper[4749]: E0320 07:14:07.921396 4749 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 07:14:07 crc kubenswrapper[4749]: E0320 07:14:07.921528 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 07:14:11.92147921 +0000 UTC m=+88.471136927 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.976780 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.976838 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.976854 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.976880 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:07 crc kubenswrapper[4749]: I0320 07:14:07.976898 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:07Z","lastTransitionTime":"2026-03-20T07:14:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.079374 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.079431 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.079451 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.079475 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.079491 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:08Z","lastTransitionTime":"2026-03-20T07:14:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.177219 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.177243 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.177230 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:14:08 crc kubenswrapper[4749]: E0320 07:14:08.177384 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.177447 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:14:08 crc kubenswrapper[4749]: E0320 07:14:08.177594 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678" Mar 20 07:14:08 crc kubenswrapper[4749]: E0320 07:14:08.177726 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 07:14:08 crc kubenswrapper[4749]: E0320 07:14:08.177914 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.182590 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.182624 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.182635 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.182648 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.182660 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:08Z","lastTransitionTime":"2026-03-20T07:14:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.286859 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.286909 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.286926 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.286952 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.286970 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:08Z","lastTransitionTime":"2026-03-20T07:14:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.390026 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.390105 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.390130 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.390159 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.390183 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:08Z","lastTransitionTime":"2026-03-20T07:14:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.492879 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.492940 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.492963 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.492987 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.493003 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:08Z","lastTransitionTime":"2026-03-20T07:14:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.596438 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.596505 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.596522 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.596562 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.596578 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:08Z","lastTransitionTime":"2026-03-20T07:14:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.624107 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"ddf9225edfc659321c44243f73dd56d4661a1d16c7c2a53b7ef69768d6b88f8c"} Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.628039 4749 generic.go:334] "Generic (PLEG): container finished" podID="19bf4391-88b7-43a0-9b6a-435261a44ed5" containerID="263bd63d005d0bec35609f642694605cd9fea786c2842c857d7f5efc7a6b6af9" exitCode=0 Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.628142 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g4qlg" event={"ID":"19bf4391-88b7-43a0-9b6a-435261a44ed5","Type":"ContainerDied","Data":"263bd63d005d0bec35609f642694605cd9fea786c2842c857d7f5efc7a6b6af9"} Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.668436 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ade5670f-28bc-4c68-b28c-cec1ee830afe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f986500b6f9e4ab2cf3366a7e05e9274f9192bdc576e52c82f8dafc9f1ce37c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0549d8b1f6c2132ed8356ac6c67078f9431cf7a9b057922e0ba5e2eb9f7f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f97471d68760ce0f43e5c1c0bafa7c6b429812dd58e2b2fa2eabd378a0789d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://478f80cc5a3895ca8ae8adbaa46990b39941e28
24b2bf1c93ca34bb7d15cbdd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a361154efd67448ce4f9008639d02864d9d3aa766b0937f3729b13a5d0b8948a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:08Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.690034 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:08Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.699832 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.699884 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.699901 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.699924 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.699944 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:08Z","lastTransitionTime":"2026-03-20T07:14:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.708691 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf9225edfc659321c44243f73dd56d4661a1d16c7c2a53b7ef69768d6b88f8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:08Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.723559 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r9vtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5fc763-08fb-4b02-a3cd-6f85310f0e14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38353c2a5737bf0e7e3552efcf7c55c31fded95481f21b06f9d364a944dbeebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x656g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r9vtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:08Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.741035 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rcq9v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f813da7-84d4-4550-ad66-f282814444a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01cb7c52842132ac657c21cb0cac4167a7b0c07ac20803552a8290a0d19e008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkbh9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rcq9v\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:08Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.788093 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2153d97b-a108-49f8-b6c8-8223ea65b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\"
,\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tdgcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:08Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.801877 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://114c64c832173caefe8b9d0030fd0ac53be4c97636f0d1735ad2b5149e38ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78309ee345e93c6d9fb93f2f6cd3b3b80b2a7feec2b0fbca5962e00d978c66c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:08Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.804306 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.804336 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.804347 4749 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.804364 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.804376 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:08Z","lastTransitionTime":"2026-03-20T07:14:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.820077 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68xpr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://080ba314933e5d85aef3e133f4372bc6e5881f2b3cf3ce1e769927e6798f328e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c96bd69f6d76b0604262b3105aafd077a3b603667218b7e6b81a5fcb0b49b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-conf
ig\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68xpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:08Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.841690 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d36aabe4-f4b7-4552-848b-0c22f7ac4753\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a9d3d56425dd88c89608d446f6d44c5f90644cea243dd023e74c5630a0a99e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e71bf5e132166e8d3e2f33eb325502e54ff36380220a07917135b27ebe41c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b332a4612c6855c57c6c15a305a1f56099dab01f849027ea2eeda56718010cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T07:13:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 07:13:46.902726 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 07:13:46.902897 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 07:13:46.903679 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-172154110/tls.crt::/tmp/serving-cert-172154110/tls.key\\\\\\\"\\\\nI0320 07:13:47.353972 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 07:13:47.356217 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 07:13:47.356236 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 07:13:47.356252 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 07:13:47.356257 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 07:13:47.360047 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 07:13:47.360067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 07:13:47.360094 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360107 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360118 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 07:13:47.360127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 07:13:47.360134 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 07:13:47.360142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 07:13:47.361128 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:13:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb6e64ecd020e07bd8f22e52fcf960c975a09da0f06a9f43daf5bfbff01de3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:08Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.855502 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:08Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.871622 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g4qlg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19bf4391-88b7-43a0-9b6a-435261a44ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g4qlg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:08Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.885619 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnwpn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdf0a692-3cf9-4abe-8b52-c81a040c0e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e24c08ed2a41f1d8a54c1c9edf5511f5ef6016bbdfa19cf6c40e8a639e1e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgjwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnwpn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:08Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 
07:14:08.901044 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k56zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d19b89e-d048-4656-b5ce-c637190ab678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k56zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:08Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.910260 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.910401 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.910427 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.910458 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.910481 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:08Z","lastTransitionTime":"2026-03-20T07:14:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.920328 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c28e09a3eab6484907f72f0c4e3809f5a04d1b344fba717e6639f97c544acf83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:08Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.946553 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:08Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.962176 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12151228-1cb9-4086-9a62-f4a9583f5f69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://727db5182d25f135cceb40ce56d93c74fd6ff79a08e042fded129a1b8c96eb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e97608b8dbd15f9f6a4df363aa16c0f7e4a3d501a4182627876064290b63e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxqfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:08Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.978092 4749 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:08Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:08 crc kubenswrapper[4749]: I0320 07:14:08.995029 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g4qlg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19bf4391-88b7-43a0-9b6a-435261a44ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://263bd63d005d0bec35609f642694605cd9fea786c2842c857d7f5efc7a6b6af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263bd63d005d0bec35609f642694605cd9fea786c2842c857d7f5efc7a6b6af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"w
aiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g4qlg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:08Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.007570 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnwpn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdf0a692-3cf9-4abe-8b52-c81a040c0e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e24c08ed2a41f1d8a54c1c9edf5511f5ef6016bbdfa19cf6c40e8a639e1e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgjwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnwpn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:09Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.013358 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.013401 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.013417 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.013437 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.013453 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:09Z","lastTransitionTime":"2026-03-20T07:14:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.021444 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k56zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d19b89e-d048-4656-b5ce-c637190ab678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k56zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:09Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.045005 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d36aabe4-f4b7-4552-848b-0c22f7ac4753\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a9d3d56425dd88c89608d446f6d44c5f90644cea243dd023e74c5630a0a99e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e71bf5e132166e8d3e2f33eb325502e54ff36380220a07917135b27ebe41c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b332a4612c6855c57c6c15a305a1f56099dab01f849027ea2eeda56718010cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T07:13:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 07:13:46.902726 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 07:13:46.902897 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 07:13:46.903679 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-172154110/tls.crt::/tmp/serving-cert-172154110/tls.key\\\\\\\"\\\\nI0320 07:13:47.353972 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 07:13:47.356217 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 07:13:47.356236 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 07:13:47.356252 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 07:13:47.356257 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 07:13:47.360047 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 07:13:47.360067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 07:13:47.360094 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360107 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360118 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 07:13:47.360127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 07:13:47.360134 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 07:13:47.360142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 07:13:47.361128 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:13:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb6e64ecd020e07bd8f22e52fcf960c975a09da0f06a9f43daf5bfbff01de3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:09Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.062332 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12151228-1cb9-4086-9a62-f4a9583f5f69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://727db5182d25f135cceb40ce56d93c74fd6ff79a08e042fded129a1b8c96eb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e97608b8dbd15f9f6a4df363aa16c0f7e4a3d501a4182627876064290b63e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxqfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:09Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.078133 4749 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c28e09a3eab6484907f72f0c4e3809f5a04d1b344fba717e6639f97c544acf83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:09Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.089407 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:09Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.100808 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:09Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.111012 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf9225edfc659321c44243f73dd56d4661a1d16c7c2a53b7ef69768d6b88f8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:09Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.116256 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.116341 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.116360 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.116385 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.116403 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:09Z","lastTransitionTime":"2026-03-20T07:14:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.125961 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r9vtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5fc763-08fb-4b02-a3cd-6f85310f0e14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38353c2a5737bf0e7e3552efcf7c55c31fded95481f21b06f9d364a944dbeebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x656g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r9vtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-20T07:14:09Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.146107 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rcq9v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f813da7-84d4-4550-ad66-f282814444a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01cb7c52842132ac657c21cb0cac4167a7b0c07ac20803552a8290a0d19e008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkbh9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":
\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rcq9v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:09Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.166954 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2153d97b-a108-49f8-b6c8-8223ea65b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tdgcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:09Z 
is after 2025-08-24T17:21:41Z" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.197511 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ade5670f-28bc-4c68-b28c-cec1ee830afe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f986500b6f9e4ab2cf3366a7e05e9274f9192bdc576e52c82f8dafc9f1ce37c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0549d8b1f6c2132ed8356ac6c67078f9431cf7a9b057922e0ba5e2eb9f7f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f97471d68760ce0f43e5c1c0bafa7c6b429812dd58e2b2fa2eabd378a0789d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"
/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://478f80cc5a3895ca8ae8adbaa46990b39941e2824b2bf1c93ca34bb7d15cbdd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a361154efd67448ce4f9008639d02864d9d3aa766b0937f3729b13a5d0b8948a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2026-03-20T07:12:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:09Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.216666 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://114c64c832173caefe8b9d0030fd0ac53be4c97636f0d1735ad2b5149e38ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78309ee345e93c6d9fb93f2f6cd3b3b80b2a7feec2b0fbca5962e00d978c66c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:09Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.218168 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.218203 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.218213 4749 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.218228 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.218239 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:09Z","lastTransitionTime":"2026-03-20T07:14:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.231863 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68xpr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://080ba314933e5d85aef3e133f4372bc6e5881f2b3cf3ce1e769927e6798f328e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c96bd69f6d76b0604262b3105aafd077a3b603667218b7e6b81a5fcb0b49b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-conf
ig\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68xpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:09Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.321107 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.321154 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.321170 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.321191 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.321208 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:09Z","lastTransitionTime":"2026-03-20T07:14:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.423994 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.424059 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.424084 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.424115 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.424138 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:09Z","lastTransitionTime":"2026-03-20T07:14:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.527314 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.527393 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.527415 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.527439 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.527458 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:09Z","lastTransitionTime":"2026-03-20T07:14:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.630103 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.630164 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.630183 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.630206 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.630223 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:09Z","lastTransitionTime":"2026-03-20T07:14:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.638426 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" event={"ID":"2153d97b-a108-49f8-b6c8-8223ea65b878","Type":"ContainerStarted","Data":"8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea"} Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.642909 4749 generic.go:334] "Generic (PLEG): container finished" podID="19bf4391-88b7-43a0-9b6a-435261a44ed5" containerID="21e9ed8774358234497eacb680940c422ab5086a1baa09efae600fd7e0080c8c" exitCode=0 Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.642973 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g4qlg" event={"ID":"19bf4391-88b7-43a0-9b6a-435261a44ed5","Type":"ContainerDied","Data":"21e9ed8774358234497eacb680940c422ab5086a1baa09efae600fd7e0080c8c"} Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.663328 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68xpr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://080ba314933e5d85aef3e133f4372bc6e5881f2b3cf3ce1e769927e6798f328e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c96bd69f6d76b0604262b3105aafd077a3b603667218b7e6b81a5fcb0b49b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68xpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:09Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.686028 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://114c64c832173caefe8b9d0030fd0ac53be4c97636f0d1735ad2b5149e38ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78309ee345e93c6d9fb93f2f6cd3b3b80b2a7feec2b0fbca5962e00d978c66c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt
\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:09Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.708523 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d36aabe4-f4b7-4552-848b-0c22f7ac4753\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a9d3d56425dd88c89608d446f6d44c5f90644cea243dd023e74c5630a0a99e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e71bf5e132166e8d3e2f33eb325502e54ff36380220a07917135b27ebe41c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b332a4612c6855c57c6c15a305a1f56099dab01f849027ea2eeda56718010cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T07:13:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 07:13:46.902726 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 07:13:46.902897 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 07:13:46.903679 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-172154110/tls.crt::/tmp/serving-cert-172154110/tls.key\\\\\\\"\\\\nI0320 07:13:47.353972 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 07:13:47.356217 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 07:13:47.356236 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 07:13:47.356252 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 07:13:47.356257 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 07:13:47.360047 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 07:13:47.360067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 07:13:47.360094 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360107 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360118 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 07:13:47.360127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 07:13:47.360134 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 07:13:47.360142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 07:13:47.361128 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:13:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb6e64ecd020e07bd8f22e52fcf960c975a09da0f06a9f43daf5bfbff01de3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:09Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.725785 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:09Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.732887 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.732949 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.732972 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.733001 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.733023 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:09Z","lastTransitionTime":"2026-03-20T07:14:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.744688 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g4qlg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19bf4391-88b7-43a0-9b6a-435261a44ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppq
c8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://263bd63d005d0bec35609f642694605cd9fea786c2842c857d7f5efc7a6b6af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263bd63d005d0bec35609f642694605cd9fea786c2842c857d7f5efc7a6b6af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:07Z\\\",\\\"reason\\\
":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e9ed8774358234497eacb680940c422ab5086a1baa09efae600fd7e0080c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e9ed8774358234497eacb680940c422ab5086a1baa09efae600fd7e0080c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g4qlg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:09Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.758511 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnwpn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdf0a692-3cf9-4abe-8b52-c81a040c0e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e24c08ed2a41f1d8a54c1c9edf5511f5ef6016bbdfa19cf6c40e8a639e1e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgjwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnwpn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:09Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.772903 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k56zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d19b89e-d048-4656-b5ce-c637190ab678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k56zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:09Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.792335 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:09Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.809227 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12151228-1cb9-4086-9a62-f4a9583f5f69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://727db5182d25f135cceb40ce56d93c74fd6ff79a08e042fded129a1b8c96eb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":
\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e97608b8dbd15f9f6a4df363aa16c0f7e4a3d501a4182627876064290b63e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxqfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:09Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.827149 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c28e09a3eab6484907f72f0c4e3809f5a04d1b344fba717e6639f97c544acf83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:09Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.835981 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.836032 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.836050 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.836071 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.836085 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:09Z","lastTransitionTime":"2026-03-20T07:14:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.844107 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:09Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.859271 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf9225edfc659321c44243f73dd56d4661a1d16c7c2a53b7ef69768d6b88f8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:09Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.871215 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r9vtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5fc763-08fb-4b02-a3cd-6f85310f0e14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38353c2a5737bf0e7e3552efcf7c55c31fded95481f21b06f9d364a944dbeebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x656g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r9vtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:09Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.888669 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rcq9v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f813da7-84d4-4550-ad66-f282814444a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01cb7c52842132ac657c21cb0cac4167a7b0c07ac20803552a8290a0d19e008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkbh9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rcq9v\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:09Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.913276 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2153d97b-a108-49f8-b6c8-8223ea65b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\"
,\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tdgcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:09Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.934898 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ade5670f-28bc-4c68-b28c-cec1ee830afe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f986500b6f9e4ab2cf3366a7e05e9274f9192bdc576e52c82f8dafc9f1ce37c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0549d8b1f6c2132ed8356ac6c67078f9431cf7a9b057922e0ba5e2eb9f7f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f97471d68760ce0f43e5c1c0bafa7c6b429812dd58e2b2fa2eabd378a0789d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://478f80cc5a3895ca8ae8adbaa46990b39941e28
24b2bf1c93ca34bb7d15cbdd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a361154efd67448ce4f9008639d02864d9d3aa766b0937f3729b13a5d0b8948a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:09Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.938543 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.938569 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.938578 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.938591 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:09 crc kubenswrapper[4749]: I0320 07:14:09.938601 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:09Z","lastTransitionTime":"2026-03-20T07:14:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.042445 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.042494 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.042512 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.042534 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.042550 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:10Z","lastTransitionTime":"2026-03-20T07:14:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.145115 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.145159 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.145176 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.145197 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.145216 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:10Z","lastTransitionTime":"2026-03-20T07:14:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.176927 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:14:10 crc kubenswrapper[4749]: E0320 07:14:10.177092 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.177558 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:14:10 crc kubenswrapper[4749]: E0320 07:14:10.177669 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.177745 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:14:10 crc kubenswrapper[4749]: E0320 07:14:10.177824 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.177999 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:14:10 crc kubenswrapper[4749]: E0320 07:14:10.178115 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.248155 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.248214 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.248237 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.248268 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.248319 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:10Z","lastTransitionTime":"2026-03-20T07:14:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.351677 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.351755 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.351773 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.351798 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.351818 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:10Z","lastTransitionTime":"2026-03-20T07:14:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.463450 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.463502 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.463518 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.463542 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.463559 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:10Z","lastTransitionTime":"2026-03-20T07:14:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.567091 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.567162 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.567185 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.567216 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.567239 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:10Z","lastTransitionTime":"2026-03-20T07:14:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.654614 4749 generic.go:334] "Generic (PLEG): container finished" podID="19bf4391-88b7-43a0-9b6a-435261a44ed5" containerID="812f9fb3af64c0bcffc23b7bef225d20328fe3348d910b174c99f2330ef75bcc" exitCode=0 Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.654687 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g4qlg" event={"ID":"19bf4391-88b7-43a0-9b6a-435261a44ed5","Type":"ContainerDied","Data":"812f9fb3af64c0bcffc23b7bef225d20328fe3348d910b174c99f2330ef75bcc"} Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.670009 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.670064 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.670086 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.670116 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.670140 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:10Z","lastTransitionTime":"2026-03-20T07:14:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.687941 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2153d97b-a108-49f8-b6c8-8223ea65b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tdgcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:10Z 
is after 2025-08-24T17:21:41Z" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.722327 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ade5670f-28bc-4c68-b28c-cec1ee830afe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f986500b6f9e4ab2cf3366a7e05e9274f9192bdc576e52c82f8dafc9f1ce37c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0549d8b1f6c2132ed8356ac6c67078f9431cf7a9b057922e0ba5e2eb9f7f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f97471d68760ce0f43e5c1c0bafa7c6b429812dd58e2b2fa2eabd378a0789d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"
/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://478f80cc5a3895ca8ae8adbaa46990b39941e2824b2bf1c93ca34bb7d15cbdd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a361154efd67448ce4f9008639d02864d9d3aa766b0937f3729b13a5d0b8948a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2026-03-20T07:12:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:10Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.739747 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:10Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.751904 4749 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.758652 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf9225edfc659321c44243f73dd56d4661a1d16c7c2a53b7ef69768d6b88f8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:10Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.774461 4749 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-r9vtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5fc763-08fb-4b02-a3cd-6f85310f0e14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38353c2a5737bf0e7e3552efcf7c55c31fded95481f21b06f9d364a944dbeebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x656g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r9vtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:10Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.776010 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.776066 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.776086 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.776112 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.776132 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:10Z","lastTransitionTime":"2026-03-20T07:14:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.793510 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rcq9v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f813da7-84d4-4550-ad66-f282814444a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01cb7c52842132ac657c21cb0cac4167a7b0c07ac20803552a8290a0d19e008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube
rnetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkbh9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rcq9v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:10Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.813542 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://114c64c832173caefe8b9d0030fd0ac53be4c97636f0d1735ad2b5149e38ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78309ee345e93c6d9fb93f2f6cd3b3b80b2a7feec2b0fbca5962e00d978c66c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\
\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:10Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.833518 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68xpr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://080ba314933e5d85aef3e133f4372bc6e5881f2b3cf3ce1e769927e6798f328e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c96bd69f6d76b0604262b3105aafd077a3b603667218b7e6b81a5fcb0b49b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\
\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68xpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:10Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.855730 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d36aabe4-f4b7-4552-848b-0c22f7ac4753\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a9d3d56425dd88c89608d446f6d44c5f90644cea243dd023e74c5630a0a99e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e71bf5e132166e8d3e2f33eb325502e54ff36380220a07917135b27ebe41c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b332a4612c6855c57c6c15a305a1f56099dab01f849027ea2eeda56718010cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T07:13:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 07:13:46.902726 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 07:13:46.902897 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 07:13:46.903679 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-172154110/tls.crt::/tmp/serving-cert-172154110/tls.key\\\\\\\"\\\\nI0320 07:13:47.353972 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 07:13:47.356217 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 07:13:47.356236 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 07:13:47.356252 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 07:13:47.356257 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 07:13:47.360047 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 07:13:47.360067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 07:13:47.360094 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360107 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360118 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 07:13:47.360127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 07:13:47.360134 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 07:13:47.360142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 07:13:47.361128 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:13:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb6e64ecd020e07bd8f22e52fcf960c975a09da0f06a9f43daf5bfbff01de3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:10Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.879902 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.879944 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.879955 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.879972 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.879984 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:10Z","lastTransitionTime":"2026-03-20T07:14:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.884518 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:10Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.906025 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g4qlg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19bf4391-88b7-43a0-9b6a-435261a44ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://263bd63d005d0bec35609f642694605cd9fea786c2842c857d7f5efc7a6b6af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263bd63d005d0bec35609f642694605cd9fea786c2842c857d7f5efc7a6b6af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e9ed8774358234497eacb680940c422ab5086a1baa09efae600fd7e0080c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e9ed8774358234497eacb680940c422ab5086a1baa09efae600fd7e0080c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://812f9fb3af64c0bcffc23b7bef225d20328fe3348d910b174c99f2330ef75bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://812f9fb3af64c0bcffc23b7bef225d20328fe3348d910b174c99f2330ef75bcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g4qlg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:10Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.924178 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnwpn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdf0a692-3cf9-4abe-8b52-c81a040c0e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e24c08ed2a41f1d8a54c1c9edf5511f5ef6016bbdfa19cf6c40e8a639e1e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgjwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnwpn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:10Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.942131 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k56zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d19b89e-d048-4656-b5ce-c637190ab678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k56zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:10Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.964011 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c28e09a3eab6484907f72f0c4e3809f5a04d1b344fba717e6639f97c544acf83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:10Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.982841 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.982908 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.982929 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.982955 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.982976 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:10Z","lastTransitionTime":"2026-03-20T07:14:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:10 crc kubenswrapper[4749]: I0320 07:14:10.984918 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:10Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.042741 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12151228-1cb9-4086-9a62-f4a9583f5f69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://727db5182d25f135cceb40ce56d93c74fd6ff79a08e042fded129a1b8c96eb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e97608b8dbd15f9f6a4df363aa16c0f7e4a3d501a4182627876064290b63e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxqfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:11Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.085964 4749 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.086004 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.086019 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.086038 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.086054 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:11Z","lastTransitionTime":"2026-03-20T07:14:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.190098 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.190178 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.190201 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.190232 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.190328 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:11Z","lastTransitionTime":"2026-03-20T07:14:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.293319 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.293363 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.293381 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.293403 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.293418 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:11Z","lastTransitionTime":"2026-03-20T07:14:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.396513 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.396557 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.396570 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.396591 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.396603 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:11Z","lastTransitionTime":"2026-03-20T07:14:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.498872 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.499166 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.499184 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.499213 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.499231 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:11Z","lastTransitionTime":"2026-03-20T07:14:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.602157 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.602227 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.602247 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.602270 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.602316 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:11Z","lastTransitionTime":"2026-03-20T07:14:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.666767 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" event={"ID":"2153d97b-a108-49f8-b6c8-8223ea65b878","Type":"ContainerStarted","Data":"c5d9179b0df84772019234da3e43b581129874c5ec34980f9cb964380dbccfff"} Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.668175 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.668223 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.668351 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.688856 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g4qlg" event={"ID":"19bf4391-88b7-43a0-9b6a-435261a44ed5","Type":"ContainerStarted","Data":"19e11d475744ecbce4f3285124657c66590dc339fe1af7d863b19d129ca09bbe"} Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.693120 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12151228-1cb9-4086-9a62-f4a9583f5f69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://727db5182d25f135cceb40ce56d93c74fd6ff79a08e042fded129a1b8c96eb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e97608b8dbd15f9f6a4df363aa16c0f7e4a3d501a4182627876064290b63e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c
915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxqfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:11Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.705567 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.705601 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.705613 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.705629 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.705641 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:11Z","lastTransitionTime":"2026-03-20T07:14:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.710603 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.713390 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.719342 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c28e09a3eab6484907f72f0c4e3809f5a04d1b344fba717e6639f97c544acf83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:11Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.735186 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:11Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.755492 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:11Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.772635 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf9225edfc659321c44243f73dd56d4661a1d16c7c2a53b7ef69768d6b88f8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:11Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.786150 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r9vtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5fc763-08fb-4b02-a3cd-6f85310f0e14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38353c2a5737bf0e7e3552efcf7c55c31fded95481f21b06f9d364a944dbeebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x656g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r9vtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:11Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.810242 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.810346 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.810375 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.810451 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.810515 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:11Z","lastTransitionTime":"2026-03-20T07:14:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.811166 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rcq9v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f813da7-84d4-4550-ad66-f282814444a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01cb7c52842132ac657c21cb0cac4167a7b0c07ac20803552a8290a0d19e008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkbh9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"host
IP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rcq9v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:11Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.833091 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2153d97b-a108-49f8-b6c8-8223ea65b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4f35833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5d9179b0df84772019234da3e43b581129874c5
ec34980f9cb964380dbccfff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccou
nt\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tdgcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:11Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.859329 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ade5670f-28bc-4c68-b28c-cec1ee830afe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f986500b6f9e4ab2cf3366a7e05e9274f9192bdc576e52c82f8dafc9f1ce37c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0549d8b1f6c2132ed8356ac6c67078f9431cf7a9b057922e0ba5e2eb9f7f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f97471d68760ce0f43e5c1c0bafa7c6b429812dd58e2b2fa2eabd378a0789d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://478f80cc5a3895ca8ae8adbaa46990b39941e28
24b2bf1c93ca34bb7d15cbdd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a361154efd67448ce4f9008639d02864d9d3aa766b0937f3729b13a5d0b8948a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:11Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:11 crc kubenswrapper[4749]: E0320 07:14:11.865785 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 07:14:19.865759893 +0000 UTC m=+96.415417570 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.865639 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.865963 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:14:11 crc kubenswrapper[4749]: E0320 07:14:11.866151 4749 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 07:14:11 crc kubenswrapper[4749]: E0320 07:14:11.866256 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 07:14:19.866233334 +0000 UTC m=+96.415891021 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 07:14:11 crc kubenswrapper[4749]: E0320 07:14:11.867914 4749 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 07:14:11 crc kubenswrapper[4749]: E0320 07:14:11.868017 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 07:14:19.867988346 +0000 UTC m=+96.417646003 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.868048 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.880138 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://114c64c832173caefe8b9d0030fd0ac53be4c97636f0d1735ad2b5149e38ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78309ee345e93c6d9fb93f2f6cd3b3b80b2a7feec2b0fbca5962e00d978c66c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:11Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.913143 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.913417 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68xpr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://080ba314933e5d85aef3e133f4372bc6e5881f2b3cf3ce1e769927e6798f328e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c96bd69f6d76b0604262b3105aafd077a3b603667218b7e6b81a5fcb0b49b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\
":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68xpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:11Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.913682 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.913736 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.913760 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.914197 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:11Z","lastTransitionTime":"2026-03-20T07:14:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.933764 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:11Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.953518 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g4qlg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19bf4391-88b7-43a0-9b6a-435261a44ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://263bd63d005d0bec35609f642694605cd9fea786c2842c857d7f5efc7a6b6af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263bd63d005d0bec35609f642694605cd9fea786c2842c857d7f5efc7a6b6af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e9ed8774358234497eacb680940c422ab5086a1baa09efae600fd7e0080c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e9ed8774358234497eacb680940c422ab5086a1baa09efae600fd7e0080c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://812f9fb3af64c0bcffc23b7bef225d20328fe3348d910b174c99f2330ef75bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://812f9fb3af64c0bcffc23b7bef225d20328fe3348d910b174c99f2330ef75bcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g4qlg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:11Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.963674 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnwpn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdf0a692-3cf9-4abe-8b52-c81a040c0e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e24c08ed2a41f1d8a54c1c9edf5511f5ef6016bbdfa19cf6c40e8a639e1e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgjwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnwpn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:11Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.968853 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.968925 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d19b89e-d048-4656-b5ce-c637190ab678-metrics-certs\") pod \"network-metrics-daemon-k56zh\" (UID: \"6d19b89e-d048-4656-b5ce-c637190ab678\") " pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.968956 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:14:11 crc kubenswrapper[4749]: E0320 07:14:11.969071 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 07:14:11 crc kubenswrapper[4749]: E0320 07:14:11.969094 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 07:14:11 crc kubenswrapper[4749]: E0320 07:14:11.969108 4749 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 07:14:11 crc kubenswrapper[4749]: E0320 07:14:11.969143 4749 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 07:14:11 crc kubenswrapper[4749]: E0320 07:14:11.969181 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 07:14:11 crc kubenswrapper[4749]: E0320 07:14:11.969158 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 07:14:19.969138016 +0000 UTC m=+96.518795673 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 07:14:11 crc kubenswrapper[4749]: E0320 07:14:11.969225 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 07:14:11 crc kubenswrapper[4749]: E0320 07:14:11.969242 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d19b89e-d048-4656-b5ce-c637190ab678-metrics-certs podName:6d19b89e-d048-4656-b5ce-c637190ab678 nodeName:}" failed. No retries permitted until 2026-03-20 07:14:19.969224078 +0000 UTC m=+96.518881725 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6d19b89e-d048-4656-b5ce-c637190ab678-metrics-certs") pod "network-metrics-daemon-k56zh" (UID: "6d19b89e-d048-4656-b5ce-c637190ab678") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 07:14:11 crc kubenswrapper[4749]: E0320 07:14:11.969246 4749 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 07:14:11 crc kubenswrapper[4749]: E0320 07:14:11.969349 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 07:14:19.969325061 +0000 UTC m=+96.518982738 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.972502 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k56zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d19b89e-d048-4656-b5ce-c637190ab678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k56zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:11Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.985628 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d36aabe4-f4b7-4552-848b-0c22f7ac4753\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a9d3d56425dd88c89608d446f6d44c5f90644cea243dd023e74c5630a0a99e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e71bf5e132166e8d3e2f33eb325502e54ff36380220a07917135b27ebe41c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b332a4612c6855c57c6c15a305a1f56099dab01f849027ea2eeda56718010cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T07:13:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 07:13:46.902726 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 07:13:46.902897 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 07:13:46.903679 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-172154110/tls.crt::/tmp/serving-cert-172154110/tls.key\\\\\\\"\\\\nI0320 07:13:47.353972 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 07:13:47.356217 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 07:13:47.356236 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 07:13:47.356252 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 07:13:47.356257 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 07:13:47.360047 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 07:13:47.360067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 07:13:47.360094 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360107 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360118 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 07:13:47.360127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 07:13:47.360134 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 07:13:47.360142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 07:13:47.361128 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:13:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb6e64ecd020e07bd8f22e52fcf960c975a09da0f06a9f43daf5bfbff01de3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:11Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:11 crc kubenswrapper[4749]: I0320 07:14:11.996598 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://114c64c832173caefe8b9d0030fd0ac53be4c97636f0d1735ad2b5149e38ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78309ee345e93c6d9fb93f2f6cd3b3b80b2a7feec2b0fbca5962e00d978c66c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:11Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.008352 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68xpr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://080ba314933e5d85aef3e133f4372bc6e5881f2b3cf3ce1e769927e6798f328e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c96bd69f6d76b0604262b3105aafd077a3b603667218b7e6b81a5fcb0b49b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68xpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:12Z is after 2025-08-24T17:21:41Z" Mar 20 
07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.016413 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.016455 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.016471 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.016492 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.016508 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:12Z","lastTransitionTime":"2026-03-20T07:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.020152 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k56zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d19b89e-d048-4656-b5ce-c637190ab678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k56zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:12Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.033901 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d36aabe4-f4b7-4552-848b-0c22f7ac4753\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a9d3d56425dd88c89608d446f6d44c5f90644cea243dd023e74c5630a0a99e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e71bf5e132166e8d3e2f33eb325502e54ff36380220a07917135b27ebe41c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b332a4612c6855c57c6c15a305a1f56099dab01f849027ea2eeda56718010cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T07:13:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 07:13:46.902726 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 07:13:46.902897 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 07:13:46.903679 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-172154110/tls.crt::/tmp/serving-cert-172154110/tls.key\\\\\\\"\\\\nI0320 07:13:47.353972 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 07:13:47.356217 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 07:13:47.356236 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 07:13:47.356252 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 07:13:47.356257 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 07:13:47.360047 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 07:13:47.360067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 07:13:47.360094 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360107 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360118 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 07:13:47.360127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 07:13:47.360134 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 07:13:47.360142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 07:13:47.361128 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:13:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb6e64ecd020e07bd8f22e52fcf960c975a09da0f06a9f43daf5bfbff01de3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:12Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.046650 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:12Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.060703 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g4qlg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19bf4391-88b7-43a0-9b6a-435261a44ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19e11d475744ecbce4f3285124657c66590dc339fe1af7d863b19d129ca09bbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://263bd63d005d0bec35609f642694605cd9fea786c2842c857d7f5efc7a6b6af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263bd63d005d0bec35609f642694605cd9fea786c2842c857d7f5efc7a6b6af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e9ed8774358234497eacb680940c422ab5086a1baa09efae600fd7e0080c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e9ed8774358234497eacb680940c422ab5086a1baa09efae600fd7e0080c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://812f9fb3af64c0bcffc23b7bef225d20328fe3348d910b174c99f2330ef75bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://812f9fb3af64c0bcffc23b7bef225d20328fe3348d910b174c99f2330ef75bcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g4qlg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:12Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.071971 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnwpn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdf0a692-3cf9-4abe-8b52-c81a040c0e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e24c08ed2a41f1d8a54c1c9edf5511f5ef6016bbdfa19cf6c40e8a639e1e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgjwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnwpn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:12Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.086861 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c28e09a3eab6484907f72f0c4e3809f5a04d1b344fba717e6639f97c544acf83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:12Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.103231 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:12Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.118075 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12151228-1cb9-4086-9a62-f4a9583f5f69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://727db5182d25f135cceb40ce56d93c74fd6ff79a08e042fded129a1b8c96eb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e97608b8dbd15f9f6a4df363aa16c0f7e4a3d501a4182627876064290b63e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxqfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:12Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.120891 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.120950 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.120967 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.120991 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.121008 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:12Z","lastTransitionTime":"2026-03-20T07:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.137837 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r9vtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5fc763-08fb-4b02-a3cd-6f85310f0e14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38353c2a5737bf0e7e3552efcf7c55c31fded95481f21b06f9d364a944dbeebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x656g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r9vtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:12Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.158208 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rcq9v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f813da7-84d4-4550-ad66-f282814444a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01cb7c52842132ac657c21cb0cac4167a7b0c07ac20803552a8290a0d19e008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkbh9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rcq9v\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:12Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.176350 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:14:12 crc kubenswrapper[4749]: E0320 07:14:12.176553 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.176362 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:14:12 crc kubenswrapper[4749]: E0320 07:14:12.177458 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.177575 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.177708 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:14:12 crc kubenswrapper[4749]: E0320 07:14:12.177877 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678" Mar 20 07:14:12 crc kubenswrapper[4749]: E0320 07:14:12.178140 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.178916 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2153d97b-a108-49f8-b6c8-8223ea65b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4f35833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47
ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5d9179b0df84772019234da3e43b581129874c5ec34980f9cb964380dbccfff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"
},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tdgcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:12Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.208539 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ade5670f-28bc-4c68-b28c-cec1ee830afe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f986500b6f9e4ab2cf3366a7e05e9274f9192bdc576e52c82f8dafc9f1ce37c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0549d8b1f6c2132ed8356ac6c67078f9431cf7a9b057922e0ba5e2eb9f7f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f97471d68760ce0f43e5c1c0bafa7c6b429812dd58e2b2fa2eabd378a0789d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://478f80cc5a3895ca8ae8adbaa46990b39941e28
24b2bf1c93ca34bb7d15cbdd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a361154efd67448ce4f9008639d02864d9d3aa766b0937f3729b13a5d0b8948a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:12Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.227993 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.228065 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.228088 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.228117 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.228140 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:12Z","lastTransitionTime":"2026-03-20T07:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.228145 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:12Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.246714 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf9225edfc659321c44243f73dd56d4661a1d16c7c2a53b7ef69768d6b88f8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:12Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.331415 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.331472 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.331490 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.331514 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.331531 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:12Z","lastTransitionTime":"2026-03-20T07:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.436867 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.436919 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.436938 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.436963 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.436980 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:12Z","lastTransitionTime":"2026-03-20T07:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.539505 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.539573 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.539592 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.539628 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.539645 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:12Z","lastTransitionTime":"2026-03-20T07:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.642363 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.642433 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.642456 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.642486 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.642511 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:12Z","lastTransitionTime":"2026-03-20T07:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.746093 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.746162 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.746183 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.746213 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.746230 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:12Z","lastTransitionTime":"2026-03-20T07:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.848721 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.848756 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.848766 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.848781 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.848792 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:12Z","lastTransitionTime":"2026-03-20T07:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.955478 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.955508 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.955518 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.955535 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:12 crc kubenswrapper[4749]: I0320 07:14:12.955547 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:12Z","lastTransitionTime":"2026-03-20T07:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:13 crc kubenswrapper[4749]: I0320 07:14:13.058420 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:13 crc kubenswrapper[4749]: I0320 07:14:13.058457 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:13 crc kubenswrapper[4749]: I0320 07:14:13.058471 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:13 crc kubenswrapper[4749]: I0320 07:14:13.058493 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:13 crc kubenswrapper[4749]: I0320 07:14:13.058507 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:13Z","lastTransitionTime":"2026-03-20T07:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:13 crc kubenswrapper[4749]: I0320 07:14:13.162005 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:13 crc kubenswrapper[4749]: I0320 07:14:13.162058 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:13 crc kubenswrapper[4749]: I0320 07:14:13.162075 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:13 crc kubenswrapper[4749]: I0320 07:14:13.162098 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:13 crc kubenswrapper[4749]: I0320 07:14:13.162115 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:13Z","lastTransitionTime":"2026-03-20T07:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:13 crc kubenswrapper[4749]: I0320 07:14:13.264526 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:13 crc kubenswrapper[4749]: I0320 07:14:13.264558 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:13 crc kubenswrapper[4749]: I0320 07:14:13.264568 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:13 crc kubenswrapper[4749]: I0320 07:14:13.264584 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:13 crc kubenswrapper[4749]: I0320 07:14:13.264595 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:13Z","lastTransitionTime":"2026-03-20T07:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:13 crc kubenswrapper[4749]: I0320 07:14:13.368010 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:13 crc kubenswrapper[4749]: I0320 07:14:13.368356 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:13 crc kubenswrapper[4749]: I0320 07:14:13.368531 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:13 crc kubenswrapper[4749]: I0320 07:14:13.368671 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:13 crc kubenswrapper[4749]: I0320 07:14:13.368801 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:13Z","lastTransitionTime":"2026-03-20T07:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:13 crc kubenswrapper[4749]: I0320 07:14:13.471824 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:13 crc kubenswrapper[4749]: I0320 07:14:13.472165 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:13 crc kubenswrapper[4749]: I0320 07:14:13.472345 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:13 crc kubenswrapper[4749]: I0320 07:14:13.472538 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:13 crc kubenswrapper[4749]: I0320 07:14:13.472676 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:13Z","lastTransitionTime":"2026-03-20T07:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:13 crc kubenswrapper[4749]: I0320 07:14:13.574928 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:13 crc kubenswrapper[4749]: I0320 07:14:13.575443 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:13 crc kubenswrapper[4749]: I0320 07:14:13.575524 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:13 crc kubenswrapper[4749]: I0320 07:14:13.575606 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:13 crc kubenswrapper[4749]: I0320 07:14:13.575714 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:13Z","lastTransitionTime":"2026-03-20T07:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:13 crc kubenswrapper[4749]: I0320 07:14:13.678180 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:13 crc kubenswrapper[4749]: I0320 07:14:13.678250 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:13 crc kubenswrapper[4749]: I0320 07:14:13.678324 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:13 crc kubenswrapper[4749]: I0320 07:14:13.678352 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:13 crc kubenswrapper[4749]: I0320 07:14:13.678370 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:13Z","lastTransitionTime":"2026-03-20T07:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:13 crc kubenswrapper[4749]: I0320 07:14:13.781853 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:13 crc kubenswrapper[4749]: I0320 07:14:13.781906 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:13 crc kubenswrapper[4749]: I0320 07:14:13.781922 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:13 crc kubenswrapper[4749]: I0320 07:14:13.781946 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:13 crc kubenswrapper[4749]: I0320 07:14:13.781965 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:13Z","lastTransitionTime":"2026-03-20T07:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:13 crc kubenswrapper[4749]: I0320 07:14:13.885278 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:13 crc kubenswrapper[4749]: I0320 07:14:13.885356 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:13 crc kubenswrapper[4749]: I0320 07:14:13.885372 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:13 crc kubenswrapper[4749]: I0320 07:14:13.885395 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:13 crc kubenswrapper[4749]: I0320 07:14:13.885412 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:13Z","lastTransitionTime":"2026-03-20T07:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:13 crc kubenswrapper[4749]: I0320 07:14:13.987276 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:13 crc kubenswrapper[4749]: I0320 07:14:13.987400 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:13 crc kubenswrapper[4749]: I0320 07:14:13.987418 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:13 crc kubenswrapper[4749]: I0320 07:14:13.987451 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:13 crc kubenswrapper[4749]: I0320 07:14:13.987468 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:13Z","lastTransitionTime":"2026-03-20T07:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.090760 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.090829 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.090849 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.090875 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.090901 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:14Z","lastTransitionTime":"2026-03-20T07:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.176475 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:14:14 crc kubenswrapper[4749]: E0320 07:14:14.176639 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.177070 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:14:14 crc kubenswrapper[4749]: E0320 07:14:14.177187 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.177277 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:14:14 crc kubenswrapper[4749]: E0320 07:14:14.177402 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.177563 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:14:14 crc kubenswrapper[4749]: E0320 07:14:14.177661 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.194675 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.194745 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.194769 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.194795 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.194818 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:14Z","lastTransitionTime":"2026-03-20T07:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.198860 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c28e09a3eab6484907f72f0c4e3809f5a04d1b344fba717e6639f97c544acf83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:14Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.211929 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:14Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.230506 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12151228-1cb9-4086-9a62-f4a9583f5f69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://727db5182d25f135cceb40ce56d93c74fd6ff79a08e042fded129a1b8c96eb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":
\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e97608b8dbd15f9f6a4df363aa16c0f7e4a3d501a4182627876064290b63e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxqfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:14Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.252683 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf9225edfc659321c44243f73dd56d4661a1d16c7c2a53b7ef69768d6b88f8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:14Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.268103 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r9vtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5fc763-08fb-4b02-a3cd-6f85310f0e14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38353c2a5737bf0e7e3552efcf7c55c31fded95481f21b06f9d364a944dbeebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x656g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r9vtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:14Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.284185 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rcq9v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f813da7-84d4-4550-ad66-f282814444a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01cb7c52842132ac657c21cb0cac4167a7b0c07ac20803552a8290a0d19e008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkbh9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rcq9v\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:14Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.291989 4749 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.296131 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.296158 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.296166 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.296180 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.296189 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:14Z","lastTransitionTime":"2026-03-20T07:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.314129 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2153d97b-a108-49f8-b6c8-8223ea65b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4f35833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5d9179b0df84772019234da3e43b581129874c5
ec34980f9cb964380dbccfff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tdgcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:14Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.345446 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ade5670f-28bc-4c68-b28c-cec1ee830afe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f986500b6f9e4ab2cf3366a7e05e9274f9192bdc576e52c82f8dafc9f1ce37c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0549d8b1f6c2132ed8356ac6c67078f9431cf7a9b057922e0ba5e2eb9f7f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f97471d68760ce0f43e5c1c0bafa7c6b429812dd58e2b2fa2eabd378a0789d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://478f80cc5a3895ca8ae8adbaa46990b39941e28
24b2bf1c93ca34bb7d15cbdd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a361154efd67448ce4f9008639d02864d9d3aa766b0937f3729b13a5d0b8948a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:14Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.360600 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:14Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.377955 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://114c64c832173caefe8b9d0030fd0ac53be4c97636f0d1735ad2b5149e38ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78309ee345e93c6d9fb93f2f6cd3b3b80b2a7feec2b0fbca5962e00d978c66c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:14Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.400185 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.400267 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.400333 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.400367 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.400415 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:14Z","lastTransitionTime":"2026-03-20T07:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.436765 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68xpr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://080ba314933e5d85aef3e133f4372bc6e5881f2b3cf3ce1e769927e6798f328e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c96bd69f6d76b0604262b3105aafd077a3b603667218b7e6b81a5fcb0b49b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68xpr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:14Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.449373 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnwpn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdf0a692-3cf9-4abe-8b52-c81a040c0e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e24c08ed2a41f1d8a54c1c9edf5511f5ef6016bbdfa19cf6c40e8a639e1e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgjwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnwpn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:14Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.461501 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k56zh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d19b89e-d048-4656-b5ce-c637190ab678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k56zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:14Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.473833 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d36aabe4-f4b7-4552-848b-0c22f7ac4753\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a9d3d56425dd88c89608d446f6d44c5f90644cea243dd023e74c5630a0a99e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e71bf5e132166e8d3e2f33eb325502e54ff36380220a07917135b27ebe41c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b332a4612c6855c57c6c15a305a1f56099dab01f849027ea2eeda56718010cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T07:13:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 07:13:46.902726 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 07:13:46.902897 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 07:13:46.903679 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-172154110/tls.crt::/tmp/serving-cert-172154110/tls.key\\\\\\\"\\\\nI0320 07:13:47.353972 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 07:13:47.356217 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 07:13:47.356236 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 07:13:47.356252 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 07:13:47.356257 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 07:13:47.360047 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 07:13:47.360067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 07:13:47.360094 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360107 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360118 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 07:13:47.360127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 07:13:47.360134 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 07:13:47.360142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 07:13:47.361128 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:13:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb6e64ecd020e07bd8f22e52fcf960c975a09da0f06a9f43daf5bfbff01de3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:14Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.486247 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:14Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.500415 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g4qlg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19bf4391-88b7-43a0-9b6a-435261a44ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19e11d475744ecbce4f3285124657c66590dc339fe1af7d863b19d129ca09bbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://263bd63d005d0bec35609f642694605cd9fea786c2842c857d7f5efc7a6b6af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263bd63d005d0bec35609f642694605cd9fea786c2842c857d7f5efc7a6b6af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e9ed8774358234497eacb680940c422ab5086a1baa09efae600fd7e0080c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e9ed8774358234497eacb680940c422ab5086a1baa09efae600fd7e0080c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://812f9fb3af64c0bcffc23b7bef225d20328fe3348d910b174c99f2330ef75bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://812f9fb3af64c0bcffc23b7bef225d20328fe3348d910b174c99f2330ef75bcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g4qlg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:14Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.502890 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.502925 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:14 crc 
kubenswrapper[4749]: I0320 07:14:14.502935 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.502950 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.502959 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:14Z","lastTransitionTime":"2026-03-20T07:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.604877 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.604914 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.604925 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.604941 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.604953 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:14Z","lastTransitionTime":"2026-03-20T07:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.702656 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tdgcw_2153d97b-a108-49f8-b6c8-8223ea65b878/ovnkube-controller/0.log" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.706814 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" event={"ID":"2153d97b-a108-49f8-b6c8-8223ea65b878","Type":"ContainerDied","Data":"c5d9179b0df84772019234da3e43b581129874c5ec34980f9cb964380dbccfff"} Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.707067 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.707125 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.707142 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.707170 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.707187 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:14Z","lastTransitionTime":"2026-03-20T07:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.706763 4749 generic.go:334] "Generic (PLEG): container finished" podID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerID="c5d9179b0df84772019234da3e43b581129874c5ec34980f9cb964380dbccfff" exitCode=1 Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.709339 4749 scope.go:117] "RemoveContainer" containerID="c5d9179b0df84772019234da3e43b581129874c5ec34980f9cb964380dbccfff" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.727354 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnwpn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdf0a692-3cf9-4abe-8b52-c81a040c0e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e24c08ed2a41f1d8a54c1c9edf5511f5ef6016bbdfa19cf6c40e8a639e1e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgjwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnwpn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:14Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.746908 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k56zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d19b89e-d048-4656-b5ce-c637190ab678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k56zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:14Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.770039 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d36aabe4-f4b7-4552-848b-0c22f7ac4753\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a9d3d56425dd88c89608d446f6d44c5f90644cea243dd023e74c5630a0a99e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e71bf5e132166e8d3e2f33eb325502e54ff36380220a07917135b27ebe41c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b332a4612c6855c57c6c15a305a1f56099dab01f849027ea2eeda56718010cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T07:13:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 07:13:46.902726 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 07:13:46.902897 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 07:13:46.903679 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-172154110/tls.crt::/tmp/serving-cert-172154110/tls.key\\\\\\\"\\\\nI0320 07:13:47.353972 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 07:13:47.356217 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 07:13:47.356236 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 07:13:47.356252 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 07:13:47.356257 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 07:13:47.360047 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 07:13:47.360067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 07:13:47.360094 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360107 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360118 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 07:13:47.360127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 07:13:47.360134 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 07:13:47.360142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 07:13:47.361128 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:13:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb6e64ecd020e07bd8f22e52fcf960c975a09da0f06a9f43daf5bfbff01de3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:14Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.786207 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:14Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.808293 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g4qlg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19bf4391-88b7-43a0-9b6a-435261a44ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19e11d475744ecbce4f3285124657c66590dc339fe1af7d863b19d129ca09bbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://263bd63d005d0bec35609f642694605cd9fea786c2842c857d7f5efc7a6b6af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263bd63d005d0bec35609f642694605cd9fea786c2842c857d7f5efc7a6b6af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e9ed8774358234497eacb680940c422ab5086a1baa09efae600fd7e0080c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e9ed8774358234497eacb680940c422ab5086a1baa09efae600fd7e0080c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://812f9fb3af64c0bcffc23b7bef225d20328fe3348d910b174c99f2330ef75bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://812f9fb3af64c0bcffc23b7bef225d20328fe3348d910b174c99f2330ef75bcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g4qlg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:14Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.809937 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.809992 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:14 crc 
kubenswrapper[4749]: I0320 07:14:14.810017 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.810049 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.810074 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:14Z","lastTransitionTime":"2026-03-20T07:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.824047 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c28e09a3eab6484907f72f0c4e3809f5a04d1b344fba717e6639f97c544acf83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:14Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.843745 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:14Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.859500 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12151228-1cb9-4086-9a62-f4a9583f5f69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://727db5182d25f135cceb40ce56d93c74fd6ff79a08e042fded129a1b8c96eb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e97608b8dbd15f9f6a4df363aa16c0f7e4a3d501a4182627876064290b63e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxqfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:14Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.876141 4749 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf9225edfc659321c44243f73dd56d4661a1d16c7c2a53b7ef69768d6b88f8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:14Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.887706 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r9vtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5fc763-08fb-4b02-a3cd-6f85310f0e14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38353c2a5737bf0e7e3552efcf7c55c31fded95481f21b06f9d364a944dbeebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x656g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r9vtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:14Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.906463 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rcq9v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f813da7-84d4-4550-ad66-f282814444a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01cb7c52842132ac657c21cb0cac4167a7b0c07ac20803552a8290a0d19e008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkbh9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rcq9v\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:14Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.912113 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.912135 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.912144 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.912157 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.912166 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:14Z","lastTransitionTime":"2026-03-20T07:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.942227 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2153d97b-a108-49f8-b6c8-8223ea65b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4f35833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5d9179b0df84772019234da3e43b581129874c5
ec34980f9cb964380dbccfff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5d9179b0df84772019234da3e43b581129874c5ec34980f9cb964380dbccfff\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T07:14:13Z\\\",\\\"message\\\":\\\"14:13.947405 6554 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 07:14:13.947534 6554 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 07:14:13.947629 6554 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 07:14:13.947971 6554 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 07:14:13.948517 6554 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 07:14:13.948540 6554 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 07:14:13.948567 6554 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 07:14:13.948566 6554 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 07:14:13.948572 6554 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 07:14:13.948583 6554 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 07:14:13.948599 6554 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 07:14:13.948633 6554 factory.go:656] Stopping watch factory\\\\nI0320 07:14:13.948652 6554 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tdgcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:14Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.954858 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.954943 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.954960 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.954986 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.955012 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:14Z","lastTransitionTime":"2026-03-20T07:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.973031 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ade5670f-28bc-4c68-b28c-cec1ee830afe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f986500b6f9e4ab2cf3366a7e05e9274f9192bdc576e52c82f8dafc9f1ce37c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0549d8b1f6c2132ed8356ac6c67078f9431cf7a9b057922e0ba5e2eb9f7f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f97471d68760ce0f43e5c1c0bafa7c6b429812dd58e2b2fa2eabd378a0789d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://478f80cc5a3895ca8ae8adbaa46990b39941e2824b2bf1c93ca34bb7d15cbdd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a361154efd67448ce4f9008639d02864d9d3aa766b0937f3729b13a5d0b8948a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:14Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:14 crc kubenswrapper[4749]: E0320 07:14:14.981606 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cbc31b-af36-4be8-8e88-99f024097007\\\",\\\"systemUUID\\\":\\\"42f570dd-c9b2-4d24-870f-033a21aa11c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:14Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.986593 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.986658 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.986670 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.986745 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.986767 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:14Z","lastTransitionTime":"2026-03-20T07:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:14 crc kubenswrapper[4749]: I0320 07:14:14.993794 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:14Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.019116 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://114c64c832173caefe8b9d0030fd0ac53be4c97636f0d1735ad2b5149e38ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78309ee345e93c6d9fb93f2f6cd3b3b80b2a7feec2b0fbca5962e00d978c66c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:15Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:15 crc kubenswrapper[4749]: E0320 07:14:15.019146 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cbc31b-af36-4be8-8e88-99f024097007\\\",\\\"systemUUID\\\":\\\"42f570dd-c9b2-4d24-870f-033a21aa11c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:15Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.024022 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.024063 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.024076 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.024095 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.024108 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:15Z","lastTransitionTime":"2026-03-20T07:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.034706 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68xpr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://080ba314933e5d85aef3e133f4372bc6e5881f2b3cf3ce1e769927e6798f328e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c96bd69f6d76b0604262b3105aafd077a3b603667218b7e6b81a5fcb0b49b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68xpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:15Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:15 crc kubenswrapper[4749]: E0320 07:14:15.044599 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cbc31b-af36-4be8-8e88-99f024097007\\\",\\\"systemUUID\\\":\\\"42f570dd-c9b2-4d24-870f-033a21aa11c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:15Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.049841 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.049893 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.049907 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.049929 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.049944 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:15Z","lastTransitionTime":"2026-03-20T07:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:15 crc kubenswrapper[4749]: E0320 07:14:15.068824 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cbc31b-af36-4be8-8e88-99f024097007\\\",\\\"systemUUID\\\":\\\"42f570dd-c9b2-4d24-870f-033a21aa11c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:15Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.073274 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.073354 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
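Every "Node became not ready" entry carries the same condition object. A small Go reconstruction of its shape, using a local illustrative struct (not the Kubernetes API types) with values copied from the entries above:

package main

import (
	"encoding/json"
	"fmt"
)

// nodeCondition mirrors the condition objects embedded in these log entries
// and in the status patches; field names follow the log output.
type nodeCondition struct {
	Type               string `json:"type"`
	Status             string `json:"status"`
	LastHeartbeatTime  string `json:"lastHeartbeatTime"`
	LastTransitionTime string `json:"lastTransitionTime"`
	Reason             string `json:"reason"`
	Message            string `json:"message"`
}

func main() {
	// Values copied from the "Node became not ready" entries in this log.
	ready := nodeCondition{
		Type:               "Ready",
		Status:             "False",
		LastHeartbeatTime:  "2026-03-20T07:14:15Z",
		LastTransitionTime: "2026-03-20T07:14:15Z",
		Reason:             "KubeletNotReady",
		Message:            "container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?",
	}
	b, err := json.Marshal(ready)
	if err != nil {
		panic(err)
	}
	fmt.Println(string(b))
}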
event="NodeHasNoDiskPressure" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.073368 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.073389 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.073405 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:15Z","lastTransitionTime":"2026-03-20T07:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:15 crc kubenswrapper[4749]: E0320 07:14:15.090323 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cbc31b-af36-4be8-8e88-99f024097007\\\",\\\"systemUUID\\\":\\\"42f570dd-c9b2-4d24-870f-033a21aa11c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:15Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:15 crc kubenswrapper[4749]: E0320 07:14:15.090474 4749 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.092400 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
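The run of "Error updating node status, will retry" entries ending in err="update node status exceeds retry count" is the signature of a bounded retry loop around the status patch. A simplified sketch of that control flow; the function shape and the retry constant are illustrative, not the actual kubelet source:

package main

import (
	"errors"
	"fmt"
)

// nodeStatusUpdateRetry bounds how many times one sync pass will try to
// patch node status before giving up (illustrative constant).
const nodeStatusUpdateRetry = 5

func updateNodeStatus(tryUpdate func(attempt int) error) error {
	for i := 0; i < nodeStatusUpdateRetry; i++ {
		if err := tryUpdate(i); err != nil {
			// Each failed attempt surfaces as
			// "Error updating node status, will retry" in the log.
			continue
		}
		return nil
	}
	// After the last attempt fails, the log shows
	// "Unable to update node status" err="update node status exceeds retry count".
	return fmt.Errorf("update node status exceeds retry count")
}

func main() {
	err := updateNodeStatus(func(int) error {
		return errors.New("failed calling webhook: certificate has expired")
	})
	fmt.Println(err)
}

Retrying cannot help here: every attempt fails the same certificate check, so the loop exhausts its budget and the node keeps cycling through the same event sequence below.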
event="NodeHasSufficientMemory" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.092439 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.092457 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.092477 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.092492 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:15Z","lastTransitionTime":"2026-03-20T07:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.195529 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.195594 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.195612 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.195642 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.195660 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:15Z","lastTransitionTime":"2026-03-20T07:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.298617 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.298661 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.298671 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.298684 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.298696 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:15Z","lastTransitionTime":"2026-03-20T07:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.400992 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.401033 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.401045 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.401064 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.401077 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:15Z","lastTransitionTime":"2026-03-20T07:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.504110 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.504155 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.504170 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.504192 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.504209 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:15Z","lastTransitionTime":"2026-03-20T07:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.610392 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.610426 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.610437 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.610450 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.610459 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:15Z","lastTransitionTime":"2026-03-20T07:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.712718 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.712758 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.712770 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.712788 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.712800 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:15Z","lastTransitionTime":"2026-03-20T07:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.717064 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tdgcw_2153d97b-a108-49f8-b6c8-8223ea65b878/ovnkube-controller/0.log" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.720767 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" event={"ID":"2153d97b-a108-49f8-b6c8-8223ea65b878","Type":"ContainerStarted","Data":"ee0e8c7afc39cdbcfdfb3a65e4f608334b574c4c5a19bc64de6ad9347174b9b3"} Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.721562 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.737910 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://114c64c832173caefe8b9d0030fd0ac53be4c97636f0d1735ad2b5149e38ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78309ee345e93c6d9fb93f2f6cd3b3b80b2a7feec2b0fbca5962e00d978c66c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:15Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.756992 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68xpr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://080ba314933e5d85aef3e133f4372bc6e5881f2b3cf3ce1e769927e6798f328e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c96bd69f6d76b0604262b3105aafd077a3b603667218b7e6b81a5fcb0b49b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68xpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:15Z is after 2025-08-24T17:21:41Z" Mar 20 
07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.782089 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g4qlg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19bf4391-88b7-43a0-9b6a-435261a44ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19e11d475744ecbce4f3285124657c66590dc339fe1af7d863b19d129ca09bbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://263bd63d005d0bec35609f642694605cd9fea786c2842c857d7f5efc7a6b6af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263bd63d005d0bec35609f642694605cd9fea786c2842c857d7f5efc7a6b6af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e9ed8774358234497eacb680940c422ab5086a1baa09efae600fd7e0080c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e9ed8774358234497eacb680940c422ab5086a1baa09efae600fd7e0080c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://812f9fb3af64c0bcffc23b7bef225d20328fe3348d910b174c99f2330ef75bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://812f9fb3af64c0bcffc23b7bef225d20328fe3348d910b174c99f2330ef75bcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g4qlg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:15Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.797842 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnwpn" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdf0a692-3cf9-4abe-8b52-c81a040c0e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e24c08ed2a41f1d8a54c1c9edf5511f5ef6016bbdfa19cf6c40e8a639e1e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgjwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnwpn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:15Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.809420 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k56zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d19b89e-d048-4656-b5ce-c637190ab678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k56zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:15Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.815378 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.815428 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.815444 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.815466 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.815481 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:15Z","lastTransitionTime":"2026-03-20T07:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
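
[editor's note] Each status patch above fails for the same root cause: the network-node-identity webhook at 127.0.0.1:9743 presents a serving certificate whose NotAfter (2025-08-24T17:21:41Z) is earlier than the node's clock (2026-03-20), so every TLS handshake is rejected before the patch is sent. The validity check the handshake performs can be reproduced offline; a minimal sketch, assuming the certificate has been exported to a PEM file (the path argument is hypothetical):

// Usage: go run certcheck.go /path/to/webhook-serving-cert.pem
package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"os"
	"time"
)

func main() {
	data, err := os.ReadFile(os.Args[1])
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	block, _ := pem.Decode(data)
	if block == nil {
		fmt.Fprintln(os.Stderr, "no PEM block found")
		os.Exit(1)
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	now := time.Now().UTC()
	switch {
	case now.After(cert.NotAfter):
		// This is the branch producing the errors in the log above.
		fmt.Printf("expired: current time %s is after %s\n",
			now.Format(time.RFC3339), cert.NotAfter.UTC().Format(time.RFC3339))
	case now.Before(cert.NotBefore):
		fmt.Printf("not yet valid: current time %s is before %s\n",
			now.Format(time.RFC3339), cert.NotBefore.UTC().Format(time.RFC3339))
	default:
		fmt.Println("certificate is within its validity window")
	}
}
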
Has your network provider started?"} Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.823540 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d36aabe4-f4b7-4552-848b-0c22f7ac4753\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a9d3d56425dd88c89608d446f6d44c5f90644cea243dd023e74c5630a0a99e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e71bf5e132166e8d3e2f33eb325502e54ff36380220a07917135b27ebe41c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b332a4612c6855c57c6c15a305a1f56099dab01f849027ea2eeda56718010cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T07:13:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 07:13:46.902726 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 07:13:46.902897 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 07:13:46.903679 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-172154110/tls.crt::/tmp/serving-cert-172154110/tls.key\\\\\\\"\\\\nI0320 07:13:47.353972 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 07:13:47.356217 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 07:13:47.356236 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 07:13:47.356252 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 07:13:47.356257 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 07:13:47.360047 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 07:13:47.360067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 07:13:47.360094 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360107 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360118 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 07:13:47.360127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 07:13:47.360134 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 07:13:47.360142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 07:13:47.361128 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:13:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb6e64ecd020e07bd8f22e52fcf960c975a09da0f06a9f43daf5bfbff01de3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:15Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.835633 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:15Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.850814 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c28e09a3eab6484907f72f0c4e3809f5a04d1b344fba717e6639f97c544acf83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:15Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.867535 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:15Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.880190 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12151228-1cb9-4086-9a62-f4a9583f5f69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://727db5182d25f135cceb40ce56d93c74fd6ff79a08e042fded129a1b8c96eb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e97608b8dbd15f9f6a4df363aa16c0f7e4a3d501a4182627876064290b63e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxqfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:15Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.896592 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:15Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.910836 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf9225edfc659321c44243f73dd56d4661a1d16c7c2a53b7ef69768d6b88f8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:15Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.917997 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.918028 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.918038 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.918053 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.918062 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:15Z","lastTransitionTime":"2026-03-20T07:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.923114 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r9vtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5fc763-08fb-4b02-a3cd-6f85310f0e14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38353c2a5737bf0e7e3552efcf7c55c31fded95481f21b06f9d364a944dbeebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x656g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r9vtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-20T07:14:15Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.942448 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rcq9v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f813da7-84d4-4550-ad66-f282814444a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01cb7c52842132ac657c21cb0cac4167a7b0c07ac20803552a8290a0d19e008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkbh9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":
\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rcq9v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:15Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.962484 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2153d97b-a108-49f8-b6c8-8223ea65b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},
{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4f35833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee0e8c7afc39cdbcfdfb3a65e4f608334b574c4c5a19bc64de6ad9347174b9b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5d9179b0df84772019234da3e43b581129874c5ec34980f9cb964380dbccfff\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T07:14:13Z\\\",\\\"message\\\":\\\"14:13.947405 6554 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 07:14:13.947534 6554 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 07:14:13.947629 6554 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 07:14:13.947971 6554 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 07:14:13.948517 6554 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 07:14:13.948540 6554 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 07:14:13.948567 6554 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 07:14:13.948566 6554 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 07:14:13.948572 6554 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 07:14:13.948583 6554 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 07:14:13.948599 6554 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 07:14:13.948633 6554 
factory.go:656] Stopping watch factory\\\\nI0320 07:14:13.948652 6554 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.1
26.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tdgcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:15Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:15 crc kubenswrapper[4749]: I0320 07:14:15.991470 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ade5670f-28bc-4c68-b28c-cec1ee830afe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f986500b6f9e4ab2cf3366a7e05e9274f9192bdc576e52c82f8dafc9f1ce37c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0549d8b1f6c2132ed8356ac6c67078f9431cf7a9b057922e0ba5e2eb9f7f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f97471d68760ce0f43e5c1c0bafa7c6b429812dd58e2b2fa2eabd378a0789d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://478f80cc5a3895ca8ae8adbaa46990b39941e28
24b2bf1c93ca34bb7d15cbdd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a361154efd67448ce4f9008639d02864d9d3aa766b0937f3729b13a5d0b8948a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:15Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.020165 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.020198 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.020208 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.020225 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.020236 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:16Z","lastTransitionTime":"2026-03-20T07:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.122061 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.122108 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.122122 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.122138 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.122151 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:16Z","lastTransitionTime":"2026-03-20T07:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.177030 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.177139 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:14:16 crc kubenswrapper[4749]: E0320 07:14:16.177242 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.177309 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.177421 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:14:16 crc kubenswrapper[4749]: E0320 07:14:16.177475 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 07:14:16 crc kubenswrapper[4749]: E0320 07:14:16.177591 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 07:14:16 crc kubenswrapper[4749]: E0320 07:14:16.178377 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.225485 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.225550 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.225669 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.225713 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.225745 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:16Z","lastTransitionTime":"2026-03-20T07:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.328954 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.329111 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.329137 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.329167 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.329189 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:16Z","lastTransitionTime":"2026-03-20T07:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.431965 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.432026 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.432042 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.432068 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.432085 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:16Z","lastTransitionTime":"2026-03-20T07:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.534924 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.534977 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.534994 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.535016 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.535033 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:16Z","lastTransitionTime":"2026-03-20T07:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.638975 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.639044 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.639061 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.639086 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.639104 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:16Z","lastTransitionTime":"2026-03-20T07:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.726409 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tdgcw_2153d97b-a108-49f8-b6c8-8223ea65b878/ovnkube-controller/1.log" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.727270 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tdgcw_2153d97b-a108-49f8-b6c8-8223ea65b878/ovnkube-controller/0.log" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.730061 4749 generic.go:334] "Generic (PLEG): container finished" podID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerID="ee0e8c7afc39cdbcfdfb3a65e4f608334b574c4c5a19bc64de6ad9347174b9b3" exitCode=1 Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.730127 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" event={"ID":"2153d97b-a108-49f8-b6c8-8223ea65b878","Type":"ContainerDied","Data":"ee0e8c7afc39cdbcfdfb3a65e4f608334b574c4c5a19bc64de6ad9347174b9b3"} Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.730222 4749 scope.go:117] "RemoveContainer" containerID="c5d9179b0df84772019234da3e43b581129874c5ec34980f9cb964380dbccfff" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.732580 4749 scope.go:117] "RemoveContainer" containerID="ee0e8c7afc39cdbcfdfb3a65e4f608334b574c4c5a19bc64de6ad9347174b9b3" Mar 20 07:14:16 crc kubenswrapper[4749]: E0320 07:14:16.733430 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-tdgcw_openshift-ovn-kubernetes(2153d97b-a108-49f8-b6c8-8223ea65b878)\"" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.743402 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.743474 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.743500 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.743531 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.743557 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:16Z","lastTransitionTime":"2026-03-20T07:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.748174 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d36aabe4-f4b7-4552-848b-0c22f7ac4753\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a9d3d56425dd88c89608d446f6d44c5f90644cea243dd023e74c5630a0a99e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e71bf5e132166e8d3e2f33eb325502e54ff36380220a07917135b27ebe41c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b332a4612c6855c57c6c15a305a1f56099dab01f849027ea2eeda56718010cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T07:13:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 07:13:46.902726 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 07:13:46.902897 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 07:13:46.903679 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-172154110/tls.crt::/tmp/serving-cert-172154110/tls.key\\\\\\\"\\\\nI0320 07:13:47.353972 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 07:13:47.356217 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 07:13:47.356236 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 07:13:47.356252 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 07:13:47.356257 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 07:13:47.360047 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 07:13:47.360067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 07:13:47.360094 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360107 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360118 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 07:13:47.360127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 07:13:47.360134 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 07:13:47.360142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 07:13:47.361128 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:13:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb6e64ecd020e07bd8f22e52fcf960c975a09da0f06a9f43daf5bfbff01de3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:16Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.767444 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:16Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.791370 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g4qlg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19bf4391-88b7-43a0-9b6a-435261a44ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19e11d475744ecbce4f3285124657c66590dc339fe1af7d863b19d129ca09bbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://263bd63d005d0bec35609f642694605cd9fea786c2842c857d7f5efc7a6b6af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263bd63d005d0bec35609f642694605cd9fea786c2842c857d7f5efc7a6b6af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e9ed8774358234497eacb680940c422ab5086a1baa09efae600fd7e0080c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e9ed8774358234497eacb680940c422ab5086a1baa09efae600fd7e0080c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://812f9fb3af64c0bcffc23b7bef225d20328fe3348d910b174c99f2330ef75bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://812f9fb3af64c0bcffc23b7bef225d20328fe3348d910b174c99f2330ef75bcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g4qlg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:16Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.803744 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnwpn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdf0a692-3cf9-4abe-8b52-c81a040c0e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e24c08ed2a41f1d8a54c1c9edf5511f5ef6016bbdfa19cf6c40e8a639e1e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgjwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnwpn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:16Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.818045 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k56zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d19b89e-d048-4656-b5ce-c637190ab678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k56zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:16Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.834252 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c28e09a3eab6484907f72f0c4e3809f5a04d1b344fba717e6639f97c544acf83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:16Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.846272 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.846317 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.846329 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.846345 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.846358 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:16Z","lastTransitionTime":"2026-03-20T07:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.851955 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:16Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.870520 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12151228-1cb9-4086-9a62-f4a9583f5f69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://727db5182d25f135cceb40ce56d93c74fd6ff79a08e042fded129a1b8c96eb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e97608b8dbd15f9f6a4df363aa16c0f7e4a3d501a4182627876064290b63e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxqfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:16Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.898922 4749 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ade5670f-28bc-4c68-b28c-cec1ee830afe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f986500b6f9e4ab2cf3366a7e05e9274f9192bdc576e52c82f8dafc9f1ce37c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0549d8b1f6c2132ed8356ac6c67078f9431cf7a9b057922e0ba5e2eb9f7f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f97471d68760ce0f43e5c1c0bafa7c6b429812dd58e2b2fa2eabd378a0789d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://478f80cc5a3895ca8ae8adbaa46990b39941e2824b2bf1c93ca34bb7d15cbdd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a361154efd67448ce4f9008639d02864d9d3aa766b0937f3729b13a5d0b8948a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}}},{\\\"containerID\\\"
:\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:16Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.918156 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:16Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.934575 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf9225edfc659321c44243f73dd56d4661a1d16c7c2a53b7ef69768d6b88f8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:16Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.949095 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r9vtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5fc763-08fb-4b02-a3cd-6f85310f0e14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38353c2a5737bf0e7e3552efcf7c55c31fded95481f21b06f9d364a944dbeebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x656g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r9vtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:16Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.949968 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.950055 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.950075 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.950103 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.950122 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:16Z","lastTransitionTime":"2026-03-20T07:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.969439 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rcq9v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f813da7-84d4-4550-ad66-f282814444a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01cb7c52842132ac657c21cb0cac4167a7b0c07ac20803552a8290a0d19e008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkbh9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"host
IP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rcq9v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:16Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:16 crc kubenswrapper[4749]: I0320 07:14:16.992586 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2153d97b-a108-49f8-b6c8-8223ea65b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4f35833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee0e8c7afc39cdbcfdfb3a65e4f608334b574c4c
5a19bc64de6ad9347174b9b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5d9179b0df84772019234da3e43b581129874c5ec34980f9cb964380dbccfff\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T07:14:13Z\\\",\\\"message\\\":\\\"14:13.947405 6554 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 07:14:13.947534 6554 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 07:14:13.947629 6554 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 07:14:13.947971 6554 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 07:14:13.948517 6554 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 07:14:13.948540 6554 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 07:14:13.948567 6554 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 07:14:13.948566 6554 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 07:14:13.948572 6554 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 07:14:13.948583 6554 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 07:14:13.948599 6554 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 07:14:13.948633 6554 factory.go:656] Stopping watch factory\\\\nI0320 07:14:13.948652 6554 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee0e8c7afc39cdbcfdfb3a65e4f608334b574c4c5a19bc64de6ad9347174b9b3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T07:14:15Z\\\",\\\"message\\\":\\\"controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:15Z is after 2025-08-24T17:21:41Z]\\\\nI0320 07:14:15.792463 6747 services_controller.go:434] Service openshift-kube-apiserver-operator/metrics retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{metrics openshift-kube-apiserver-operator 70a45401-9850-413a-87c2-e90a7258374e 4267 0 2025-02-23 05:12:37 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:kube-apiserver-operator] map[exclude.release.openshift.io/internal-openshift-hosted:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-secret-name:kube-apiserver-operator-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 
service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00761a26b \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{Service\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\
\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tdgcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:16Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:17 crc kubenswrapper[4749]: I0320 07:14:17.012029 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://114c64c832173caefe8b9d0030fd0ac53be4c97636f0d1735ad2b5149e38ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78309ee345e93c6d9fb93f2f6cd3b3b80b2a7feec2b0fbca5962e00d978c66c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:17Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:17 crc kubenswrapper[4749]: I0320 07:14:17.027628 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68xpr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://080ba314933e5d85aef3e133f4372bc6e5881f2b3cf3ce1e769927e6798f328e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c96bd69f6d76b0604262b3105aafd077a3b603667218b7e6b81a5fcb0b49b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68xpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:17Z is after 2025-08-24T17:21:41Z" Mar 20 
07:14:17 crc kubenswrapper[4749]: I0320 07:14:17.052890 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:17 crc kubenswrapper[4749]: I0320 07:14:17.052945 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:17 crc kubenswrapper[4749]: I0320 07:14:17.052962 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:17 crc kubenswrapper[4749]: I0320 07:14:17.052985 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:17 crc kubenswrapper[4749]: I0320 07:14:17.053003 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:17Z","lastTransitionTime":"2026-03-20T07:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:17 crc kubenswrapper[4749]: I0320 07:14:17.156063 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:17 crc kubenswrapper[4749]: I0320 07:14:17.156111 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:17 crc kubenswrapper[4749]: I0320 07:14:17.156127 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:17 crc kubenswrapper[4749]: I0320 07:14:17.156150 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:17 crc kubenswrapper[4749]: I0320 07:14:17.156161 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:17Z","lastTransitionTime":"2026-03-20T07:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:17 crc kubenswrapper[4749]: I0320 07:14:17.258962 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:17 crc kubenswrapper[4749]: I0320 07:14:17.259021 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:17 crc kubenswrapper[4749]: I0320 07:14:17.259038 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:17 crc kubenswrapper[4749]: I0320 07:14:17.259062 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:17 crc kubenswrapper[4749]: I0320 07:14:17.259080 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:17Z","lastTransitionTime":"2026-03-20T07:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:17 crc kubenswrapper[4749]: I0320 07:14:17.362438 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:17 crc kubenswrapper[4749]: I0320 07:14:17.362496 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:17 crc kubenswrapper[4749]: I0320 07:14:17.362514 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:17 crc kubenswrapper[4749]: I0320 07:14:17.362539 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:17 crc kubenswrapper[4749]: I0320 07:14:17.362562 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:17Z","lastTransitionTime":"2026-03-20T07:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:17 crc kubenswrapper[4749]: I0320 07:14:17.465219 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:17 crc kubenswrapper[4749]: I0320 07:14:17.465334 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:17 crc kubenswrapper[4749]: I0320 07:14:17.465363 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:17 crc kubenswrapper[4749]: I0320 07:14:17.465398 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:17 crc kubenswrapper[4749]: I0320 07:14:17.465419 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:17Z","lastTransitionTime":"2026-03-20T07:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:17 crc kubenswrapper[4749]: I0320 07:14:17.568277 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:17 crc kubenswrapper[4749]: I0320 07:14:17.568349 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:17 crc kubenswrapper[4749]: I0320 07:14:17.568363 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:17 crc kubenswrapper[4749]: I0320 07:14:17.568381 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:17 crc kubenswrapper[4749]: I0320 07:14:17.568396 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:17Z","lastTransitionTime":"2026-03-20T07:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:17 crc kubenswrapper[4749]: I0320 07:14:17.670263 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:17 crc kubenswrapper[4749]: I0320 07:14:17.670329 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:17 crc kubenswrapper[4749]: I0320 07:14:17.670343 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:17 crc kubenswrapper[4749]: I0320 07:14:17.670358 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:17 crc kubenswrapper[4749]: I0320 07:14:17.670369 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:17Z","lastTransitionTime":"2026-03-20T07:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:17 crc kubenswrapper[4749]: I0320 07:14:17.773244 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:17 crc kubenswrapper[4749]: I0320 07:14:17.773315 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:17 crc kubenswrapper[4749]: I0320 07:14:17.773331 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:17 crc kubenswrapper[4749]: I0320 07:14:17.773352 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:17 crc kubenswrapper[4749]: I0320 07:14:17.773368 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:17Z","lastTransitionTime":"2026-03-20T07:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:17 crc kubenswrapper[4749]: I0320 07:14:17.875851 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:17 crc kubenswrapper[4749]: I0320 07:14:17.875901 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:17 crc kubenswrapper[4749]: I0320 07:14:17.875918 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:17 crc kubenswrapper[4749]: I0320 07:14:17.875939 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:17 crc kubenswrapper[4749]: I0320 07:14:17.875955 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:17Z","lastTransitionTime":"2026-03-20T07:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:17 crc kubenswrapper[4749]: I0320 07:14:17.978978 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:17 crc kubenswrapper[4749]: I0320 07:14:17.979025 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:17 crc kubenswrapper[4749]: I0320 07:14:17.979041 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:17 crc kubenswrapper[4749]: I0320 07:14:17.979064 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:17 crc kubenswrapper[4749]: I0320 07:14:17.979081 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:17Z","lastTransitionTime":"2026-03-20T07:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.082632 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.082694 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.082716 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.082744 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.082766 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:18Z","lastTransitionTime":"2026-03-20T07:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.186189 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.186262 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.186315 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.186341 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.186359 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:18Z","lastTransitionTime":"2026-03-20T07:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.416172 4749 scope.go:117] "RemoveContainer" containerID="f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8" Mar 20 07:14:18 crc kubenswrapper[4749]: E0320 07:14:18.416452 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.417317 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.417394 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.417339 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.417397 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:14:18 crc kubenswrapper[4749]: E0320 07:14:18.417743 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 07:14:18 crc kubenswrapper[4749]: E0320 07:14:18.419112 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 07:14:18 crc kubenswrapper[4749]: E0320 07:14:18.422235 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 07:14:18 crc kubenswrapper[4749]: E0320 07:14:18.422431 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.423967 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.423992 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.424001 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.424014 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.424024 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:18Z","lastTransitionTime":"2026-03-20T07:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.427210 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tdgcw_2153d97b-a108-49f8-b6c8-8223ea65b878/ovnkube-controller/1.log" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.431303 4749 scope.go:117] "RemoveContainer" containerID="ee0e8c7afc39cdbcfdfb3a65e4f608334b574c4c5a19bc64de6ad9347174b9b3" Mar 20 07:14:18 crc kubenswrapper[4749]: E0320 07:14:18.431530 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-tdgcw_openshift-ovn-kubernetes(2153d97b-a108-49f8-b6c8-8223ea65b878)\"" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.434943 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.445867 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://114c64c832173caefe8b9d0030fd0ac53be4c97636f0d1735ad2b5149e38ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78309ee345e93c6d9fb93f2f6cd3b3b80b2a7feec2b0fbca5962e00d978c66c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:18Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.461687 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68xpr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://080ba314933e5d85aef3e133f4372bc6e5881f2b3cf3ce1e769927e6798f328e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c96bd69f6d76b0604262b3105aafd077a3b603667218b7e6b81a5fcb0b49b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68xpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:18Z is after 2025-08-24T17:21:41Z" Mar 20 
07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.481997 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d36aabe4-f4b7-4552-848b-0c22f7ac4753\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a9d3d56425dd88c89608d446f6d44c5f90644cea243dd023e74c5630a0a99e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e71bf5e132166e8d3e2f33eb325502e54ff36380220a07917135b27ebe41c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b332a4612c6855c57c6c15a305a1f56099dab01f849027ea2eeda56718010cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T07:13:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 07:13:46.902726 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 07:13:46.902897 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 07:13:46.903679 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-172154110/tls.crt::/tmp/serving-cert-172154110/tls.key\\\\\\\"\\\\nI0320 07:13:47.353972 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 07:13:47.356217 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 07:13:47.356236 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 07:13:47.356252 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 07:13:47.356257 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 07:13:47.360047 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 07:13:47.360067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 07:13:47.360094 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360107 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360118 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 07:13:47.360127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 07:13:47.360134 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 07:13:47.360142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 07:13:47.361128 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:13:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb6e64ecd020e07bd8f22e52fcf960c975a09da0f06a9f43daf5bfbff01de3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:18Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.499191 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:18Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.521064 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g4qlg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19bf4391-88b7-43a0-9b6a-435261a44ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19e11d475744ecbce4f3285124657c66590dc339fe1af7d863b19d129ca09bbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://263bd63d005d0bec35609f642694605cd9fea786c2842c857d7f5efc7a6b6af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263bd63d005d0bec35609f642694605cd9fea786c2842c857d7f5efc7a6b6af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e9ed8774358234497eacb680940c422ab5086a1baa09efae600fd7e0080c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e9ed8774358234497eacb680940c422ab5086a1baa09efae600fd7e0080c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://812f9fb3af64c0bcffc23b7bef225d20328fe3348d910b174c99f2330ef75bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://812f9fb3af64c0bcffc23b7bef225d20328fe3348d910b174c99f2330ef75bcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g4qlg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:18Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.527369 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.527428 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:18 crc 
kubenswrapper[4749]: I0320 07:14:18.527445 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.527471 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.527489 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:18Z","lastTransitionTime":"2026-03-20T07:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.536648 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnwpn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdf0a692-3cf9-4abe-8b52-c81a040c0e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e24c08ed2a41f1d8a54c1c9edf5511f5ef6016bbdfa19cf6c40e8a639e1e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgjwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnwpn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:18Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.552035 4749 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k56zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d19b89e-d048-4656-b5ce-c637190ab678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k56zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:18Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.568368 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c28e09a3eab6484907f72f0c4e3809f5a04d1b344fba717e6639f97c544acf83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:18Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.582824 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:18Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.597479 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12151228-1cb9-4086-9a62-f4a9583f5f69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://727db5182d25f135cceb40ce56d93c74fd6ff79a08e042fded129a1b8c96eb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e97608b8dbd15f9f6a4df363aa16c0f7e4a3d501a4182627876064290b63e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxqfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:18Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.611741 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rcq9v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f813da7-84d4-4550-ad66-f282814444a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01cb7c52842132ac657c21cb0cac4167a7b0c07ac20803552a8290a0d19e008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-
var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkbh9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rcq9v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:18Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.630472 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.630506 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.630513 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.630526 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.630535 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:18Z","lastTransitionTime":"2026-03-20T07:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.642378 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2153d97b-a108-49f8-b6c8-8223ea65b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4f35833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee0e8c7afc39cdbcfdfb3a65e4f608334b574c4c5a19bc64de6ad9347174b9b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee0e8c7afc39cdbcfdfb3a65e4f608334b574c4c5a19bc64de6ad9347174b9b3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T07:14:15Z\\\",\\\"message\\\":\\\"controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:15Z is after 2025-08-24T17:21:41Z]\\\\nI0320 07:14:15.792463 6747 services_controller.go:434] Service openshift-kube-apiserver-operator/metrics retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{metrics openshift-kube-apiserver-operator 70a45401-9850-413a-87c2-e90a7258374e 4267 0 2025-02-23 05:12:37 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:kube-apiserver-operator] map[exclude.release.openshift.io/internal-openshift-hosted:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-secret-name:kube-apiserver-operator-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00761a26b \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{Service\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=ovnkube-controller pod=ovnkube-node-tdgcw_openshift-ovn-kubernetes(2153d97b-a108-49f8-b6c8-8223ea65b878)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tdgcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:18Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.672063 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ade5670f-28bc-4c68-b28c-cec1ee830afe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f986500b6f9e4ab2cf3366a7e05e9274f9192bdc576e52c82f8dafc9f1ce37c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0a0549d8b1f6c2132ed8356ac6c67078f9431cf7a9b057922e0ba5e2eb9f7f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f97471d68760ce0f43e5c1c0bafa7c6b429812dd58e2b2fa2eabd378a0789d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://478f80cc5a3895ca8ae8adbaa46990b39941e2824b2bf1c93ca34bb7d15cbdd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a361154efd67448ce4f9008639d02864d9d3aa766b0937f3729b13a5d0b8948a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce4237357
40b347\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:18Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.688197 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:18Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.705970 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf9225edfc659321c44243f73dd56d4661a1d16c7c2a53b7ef69768d6b88f8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:18Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.717174 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r9vtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5fc763-08fb-4b02-a3cd-6f85310f0e14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38353c2a5737bf0e7e3552efcf7c55c31fded95481f21b06f9d364a944dbeebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x656g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r9vtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:18Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.733403 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.733457 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.733467 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.733481 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.733490 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:18Z","lastTransitionTime":"2026-03-20T07:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.838697 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.838745 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.838757 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.838781 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.838798 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:18Z","lastTransitionTime":"2026-03-20T07:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.941549 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.941624 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.941650 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.941680 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:18 crc kubenswrapper[4749]: I0320 07:14:18.941702 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:18Z","lastTransitionTime":"2026-03-20T07:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 07:14:19 crc kubenswrapper[4749]: I0320 07:14:19.045101 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 07:14:19 crc kubenswrapper[4749]: I0320 07:14:19.045146 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 07:14:19 crc kubenswrapper[4749]: I0320 07:14:19.045155 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 07:14:19 crc kubenswrapper[4749]: I0320 07:14:19.045170 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 07:14:19 crc kubenswrapper[4749]: I0320 07:14:19.045184 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:19Z","lastTransitionTime":"2026-03-20T07:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 07:14:19 crc kubenswrapper[4749]: I0320 07:14:19.865576 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 07:14:19 crc kubenswrapper[4749]: I0320 07:14:19.865637 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 07:14:19 crc kubenswrapper[4749]: I0320 07:14:19.865648 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 07:14:19 crc kubenswrapper[4749]: I0320 07:14:19.865660 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 07:14:19 crc kubenswrapper[4749]: I0320 07:14:19.865668 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:19Z","lastTransitionTime":"2026-03-20T07:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 07:14:19 crc kubenswrapper[4749]: I0320 07:14:19.932980 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 07:14:19 crc kubenswrapper[4749]: I0320 07:14:19.933054 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 07:14:19 crc kubenswrapper[4749]: I0320 07:14:19.933107 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 07:14:19 crc kubenswrapper[4749]: E0320 07:14:19.933171 4749 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 07:14:19 crc kubenswrapper[4749]: E0320 07:14:19.933242 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 07:14:35.933209451 +0000 UTC m=+112.482867148 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 07:14:19 crc kubenswrapper[4749]: E0320 07:14:19.933297 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 07:14:35.933268702 +0000 UTC m=+112.482926349 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 07:14:19 crc kubenswrapper[4749]: E0320 07:14:19.933331 4749 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 07:14:19 crc kubenswrapper[4749]: E0320 07:14:19.933424 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 07:14:35.933404905 +0000 UTC m=+112.483062582 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 07:14:19 crc kubenswrapper[4749]: I0320 07:14:19.968150 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 07:14:19 crc kubenswrapper[4749]: I0320 07:14:19.968183 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 07:14:19 crc kubenswrapper[4749]: I0320 07:14:19.968193 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 07:14:19 crc kubenswrapper[4749]: I0320 07:14:19.968209 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 07:14:19 crc kubenswrapper[4749]: I0320 07:14:19.968222 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:19Z","lastTransitionTime":"2026-03-20T07:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
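The 16-second delay in the nestedpendingoperations entries above is the visible effect of the volume manager's per-operation exponential backoff: each failed mount or unmount roughly doubles the wait before the next attempt, up to a cap. The Go sketch below shows that shape only; the 500 ms initial delay, factor of 2, and 2m2s cap are illustrative assumptions, not values read from this kubelet build.

package main

import (
	"fmt"
	"time"
)

// nextBackoff doubles the previous delay up to maxDelay, the shape of the
// retry spacing behind "durationBeforeRetry 16s" in the entries above.
// All constants here are illustrative assumptions.
func nextBackoff(prev, maxDelay time.Duration) time.Duration {
	if prev == 0 {
		return 500 * time.Millisecond // assumed initial delay
	}
	if next := 2 * prev; next < maxDelay {
		return next
	}
	return maxDelay
}

func main() {
	var d time.Duration
	for attempt := 1; attempt <= 8; attempt++ {
		d = nextBackoff(d, 2*time.Minute+2*time.Second)
		fmt.Printf("attempt %d: retry in %v\n", attempt, d)
	}
	// Prints 500ms, 1s, 2s, 4s, 8s, 16s, 32s, 1m4s: a 16s delay implies
	// several failures have already occurred on this volume.
}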
Mar 20 07:14:20 crc kubenswrapper[4749]: I0320 07:14:20.034696 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d19b89e-d048-4656-b5ce-c637190ab678-metrics-certs\") pod \"network-metrics-daemon-k56zh\" (UID: \"6d19b89e-d048-4656-b5ce-c637190ab678\") " pod="openshift-multus/network-metrics-daemon-k56zh"
Mar 20 07:14:20 crc kubenswrapper[4749]: E0320 07:14:20.034995 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 20 07:14:20 crc kubenswrapper[4749]: E0320 07:14:20.035030 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 20 07:14:20 crc kubenswrapper[4749]: E0320 07:14:20.035057 4749 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 07:14:20 crc kubenswrapper[4749]: E0320 07:14:20.035158 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 07:14:36.035130439 +0000 UTC m=+112.584788126 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 07:14:20 crc kubenswrapper[4749]: E0320 07:14:20.035829 4749 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 20 07:14:20 crc kubenswrapper[4749]: I0320 07:14:20.034791 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 07:14:20 crc kubenswrapper[4749]: E0320 07:14:20.035963 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d19b89e-d048-4656-b5ce-c637190ab678-metrics-certs podName:6d19b89e-d048-4656-b5ce-c637190ab678 nodeName:}" failed. No retries permitted until 2026-03-20 07:14:36.035939518 +0000 UTC m=+112.585597175 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6d19b89e-d048-4656-b5ce-c637190ab678-metrics-certs") pod "network-metrics-daemon-k56zh" (UID: "6d19b89e-d048-4656-b5ce-c637190ab678") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 20 07:14:20 crc kubenswrapper[4749]: I0320 07:14:20.036008 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 07:14:20 crc kubenswrapper[4749]: E0320 07:14:20.036207 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 20 07:14:20 crc kubenswrapper[4749]: E0320 07:14:20.036235 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 20 07:14:20 crc kubenswrapper[4749]: E0320 07:14:20.036258 4749 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 07:14:20 crc kubenswrapper[4749]: E0320 07:14:20.036344 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 07:14:36.036325687 +0000 UTC m=+112.585983354 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 07:14:20 crc kubenswrapper[4749]: I0320 07:14:20.071106 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 07:14:20 crc kubenswrapper[4749]: I0320 07:14:20.071177 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 07:14:20 crc kubenswrapper[4749]: I0320 07:14:20.071195 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 07:14:20 crc kubenswrapper[4749]: I0320 07:14:20.071220 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 07:14:20 crc kubenswrapper[4749]: I0320 07:14:20.071248 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:20Z","lastTransitionTime":"2026-03-20T07:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
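The projected.go errors above fail because a kube-api-access-* volume is assembled from several sources, two of which are the kube-root-ca.crt and openshift-service-ca.crt ConfigMaps that the kubelet has not yet registered for this namespace. Below is a sketch of that volume's shape as a JSON fragment parsed in Go; the serviceAccountToken fields and item mappings are typical values filled in for illustration, not read from this pod's spec.

package main

import (
	"encoding/json"
	"fmt"
)

// Approximate shape of the kube-api-access-s2dwl projected volume whose
// SetUp fails above. Both configMap sources must be visible to the kubelet
// before the mount can succeed; "not registered" means they are not yet.
// Field values are illustrative assumptions, not taken from the pod spec.
const kubeAPIAccess = `{
  "name": "kube-api-access-s2dwl",
  "projected": {
    "sources": [
      {"serviceAccountToken": {"expirationSeconds": 3607, "path": "token"}},
      {"configMap": {"name": "kube-root-ca.crt", "items": [{"key": "ca.crt", "path": "ca.crt"}]}},
      {"configMap": {"name": "openshift-service-ca.crt", "items": [{"key": "service-ca.crt", "path": "service-ca.crt"}]}}
    ]
  }
}`

func main() {
	var vol map[string]any
	if err := json.Unmarshal([]byte(kubeAPIAccess), &vol); err != nil {
		panic(err)
	}
	fmt.Println("parsed projected volume:", vol["name"])
}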
Mar 20 07:14:20 crc kubenswrapper[4749]: I0320 07:14:20.174565 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 07:14:20 crc kubenswrapper[4749]: I0320 07:14:20.174613 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 07:14:20 crc kubenswrapper[4749]: I0320 07:14:20.174625 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 07:14:20 crc kubenswrapper[4749]: I0320 07:14:20.174643 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 07:14:20 crc kubenswrapper[4749]: I0320 07:14:20.174654 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:20Z","lastTransitionTime":"2026-03-20T07:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 07:14:20 crc kubenswrapper[4749]: I0320 07:14:20.177117 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 07:14:20 crc kubenswrapper[4749]: I0320 07:14:20.177190 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 07:14:20 crc kubenswrapper[4749]: I0320 07:14:20.177266 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 07:14:20 crc kubenswrapper[4749]: E0320 07:14:20.177476 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 07:14:20 crc kubenswrapper[4749]: I0320 07:14:20.177471 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh"
Mar 20 07:14:20 crc kubenswrapper[4749]: E0320 07:14:20.177607 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 07:14:20 crc kubenswrapper[4749]: E0320 07:14:20.177665 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678"
Mar 20 07:14:20 crc kubenswrapper[4749]: E0320 07:14:20.177724 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 07:14:20 crc kubenswrapper[4749]: I0320 07:14:20.277375 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 07:14:20 crc kubenswrapper[4749]: I0320 07:14:20.277424 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 07:14:20 crc kubenswrapper[4749]: I0320 07:14:20.277436 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 07:14:20 crc kubenswrapper[4749]: I0320 07:14:20.277456 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 07:14:20 crc kubenswrapper[4749]: I0320 07:14:20.277471 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:20Z","lastTransitionTime":"2026-03-20T07:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 07:14:20 crc kubenswrapper[4749]: I0320 07:14:20.379757 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 07:14:20 crc kubenswrapper[4749]: I0320 07:14:20.379823 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 07:14:20 crc kubenswrapper[4749]: I0320 07:14:20.379842 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 07:14:20 crc kubenswrapper[4749]: I0320 07:14:20.379867 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 07:14:20 crc kubenswrapper[4749]: I0320 07:14:20.379887 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:20Z","lastTransitionTime":"2026-03-20T07:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 07:14:22 crc kubenswrapper[4749]: I0320 07:14:22.032000 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 07:14:22 crc kubenswrapper[4749]: I0320 07:14:22.032066 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 07:14:22 crc kubenswrapper[4749]: I0320 07:14:22.032083 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 07:14:22 crc kubenswrapper[4749]: I0320 07:14:22.032108 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 07:14:22 crc kubenswrapper[4749]: I0320 07:14:22.032125 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:22Z","lastTransitionTime":"2026-03-20T07:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 07:14:22 crc kubenswrapper[4749]: I0320 07:14:22.135100 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 07:14:22 crc kubenswrapper[4749]: I0320 07:14:22.138373 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 07:14:22 crc kubenswrapper[4749]: I0320 07:14:22.138386 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 07:14:22 crc kubenswrapper[4749]: I0320 07:14:22.138412 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 07:14:22 crc kubenswrapper[4749]: I0320 07:14:22.138422 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:22Z","lastTransitionTime":"2026-03-20T07:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 07:14:22 crc kubenswrapper[4749]: I0320 07:14:22.176861 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh"
Mar 20 07:14:22 crc kubenswrapper[4749]: I0320 07:14:22.176964 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 07:14:22 crc kubenswrapper[4749]: E0320 07:14:22.177073 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678"
Mar 20 07:14:22 crc kubenswrapper[4749]: E0320 07:14:22.177220 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 07:14:22 crc kubenswrapper[4749]: I0320 07:14:22.177706 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 07:14:22 crc kubenswrapper[4749]: E0320 07:14:22.177875 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 07:14:22 crc kubenswrapper[4749]: I0320 07:14:22.178040 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 07:14:22 crc kubenswrapper[4749]: E0320 07:14:22.178199 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 07:14:22 crc kubenswrapper[4749]: I0320 07:14:22.240756 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 07:14:22 crc kubenswrapper[4749]: I0320 07:14:22.240802 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 07:14:22 crc kubenswrapper[4749]: I0320 07:14:22.240816 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 07:14:22 crc kubenswrapper[4749]: I0320 07:14:22.240835 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 07:14:22 crc kubenswrapper[4749]: I0320 07:14:22.240849 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:22Z","lastTransitionTime":"2026-03-20T07:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Has your network provider started?"} Mar 20 07:14:23 crc kubenswrapper[4749]: I0320 07:14:23.892091 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:23 crc kubenswrapper[4749]: I0320 07:14:23.892155 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:23 crc kubenswrapper[4749]: I0320 07:14:23.892168 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:23 crc kubenswrapper[4749]: I0320 07:14:23.892188 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:23 crc kubenswrapper[4749]: I0320 07:14:23.892200 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:23Z","lastTransitionTime":"2026-03-20T07:14:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:23 crc kubenswrapper[4749]: I0320 07:14:23.994221 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:23 crc kubenswrapper[4749]: I0320 07:14:23.994274 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:23 crc kubenswrapper[4749]: I0320 07:14:23.994312 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:23 crc kubenswrapper[4749]: I0320 07:14:23.994332 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:23 crc kubenswrapper[4749]: I0320 07:14:23.994350 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:23Z","lastTransitionTime":"2026-03-20T07:14:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.096669 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.096731 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.096748 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.096775 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.096793 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:24Z","lastTransitionTime":"2026-03-20T07:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.176524 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:14:24 crc kubenswrapper[4749]: E0320 07:14:24.176720 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.177405 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.177450 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:14:24 crc kubenswrapper[4749]: E0320 07:14:24.178557 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.178709 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:14:24 crc kubenswrapper[4749]: E0320 07:14:24.178928 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 07:14:24 crc kubenswrapper[4749]: E0320 07:14:24.179144 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678" Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.199595 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.199644 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.199662 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.199685 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.199702 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:24Z","lastTransitionTime":"2026-03-20T07:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.201847 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ade5670f-28bc-4c68-b28c-cec1ee830afe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f986500b6f9e4ab2cf3366a7e05e9274f9192bdc576e52c82f8dafc9f1ce37c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0549d8b1f6c2132ed8356ac6c67078f9431cf7a9b057922e0ba5e2eb9f7f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"ima
geID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f97471d68760ce0f43e5c1c0bafa7c6b429812dd58e2b2fa2eabd378a0789d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://478f80cc5a3895ca8ae8adbaa46990b39941e2824b2bf1c93ca34bb7d15cbdd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a361154efd67448ce4f9008639d02864d9d3aa766b0937f3729b13a5d0b8948a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:24Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.215001 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1384b577-c860-43e3-927f-3aa6d9eaadbc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3db71fc201b999f26a4841d7cff88cd6c415d1a2ad4920d354ed394ac8ad2982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ef347d178defb70362fc7330bec72e266f77e6bd46c7ce4cf0c7018d585171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ef347d178defb70362fc7330bec72e266f77e6bd46c7ce4cf0c7018d585171\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:24Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.231586 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:24Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.249971 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf9225edfc659321c44243f73dd56d4661a1d16c7c2a53b7ef69768d6b88f8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:24Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.265743 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r9vtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5fc763-08fb-4b02-a3cd-6f85310f0e14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38353c2a5737bf0e7e3552efcf7c55c31fded95481f21b06f9d364a944dbeebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x656g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r9vtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:24Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.281727 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rcq9v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f813da7-84d4-4550-ad66-f282814444a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01cb7c52842132ac657c21cb0cac4167a7b0c07ac20803552a8290a0d19e008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkbh9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rcq9v\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:24Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.301205 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2153d97b-a108-49f8-b6c8-8223ea65b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4f35
833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee0e8c7afc39cdbcfdfb3a65e4f608334b574c4c5a19bc64de6ad9347174b9b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee0e8c7afc39cdbcfdfb3a65e4f608334b574c4c5a19bc64de6ad9347174b9b3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T07:14:15Z\\\",\\\"message\\\":\\\"controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:15Z is after 2025-08-24T17:21:41Z]\\\\nI0320 07:14:15.792463 6747 services_controller.go:434] Service openshift-kube-apiserver-operator/metrics retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{metrics openshift-kube-apiserver-operator 70a45401-9850-413a-87c2-e90a7258374e 4267 0 2025-02-23 05:12:37 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:kube-apiserver-operator] map[exclude.release.openshift.io/internal-openshift-hosted:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-secret-name:kube-apiserver-operator-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00761a26b \\\\u003cnil\\\\u003e}] [] 
[]},Spec:ServiceSpec{Ports:[]ServicePort{Service\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-tdgcw_openshift-ovn-kubernetes(2153d97b-a108-49f8-b6c8-8223ea65b878)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-
77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tdgcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:24Z is after 2025-08-24T17:21:41Z"
Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.301965 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.302011 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.302063 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.302091 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.302111 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:24Z","lastTransitionTime":"2026-03-20T07:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.316062 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://114c64c832173caefe8b9d0030fd0ac53be4c97636f0d1735ad2b5149e38ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78309ee345e93c6d9fb93f2f6cd3b3b80b2a7feec2b0fbca5962e00d978c66c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:24Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.331088 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68xpr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://080ba314933e5d85aef3e133f4372bc6e5881f2b3cf3ce1e769927e6798f328e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c96bd69f6d76b0604262b3105aafd077a3b603667218b7e6b81a5fcb0b49b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68xpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:24Z is after 2025-08-24T17:21:41Z" Mar 20 
07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.347672 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d36aabe4-f4b7-4552-848b-0c22f7ac4753\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a9d3d56425dd88c89608d446f6d44c5f90644cea243dd023e74c5630a0a99e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e71bf5e132166e8d3e2f33eb325502e54ff36380220a07917135b27ebe41c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b332a4612c6855c57c6c15a305a1f56099dab01f849027ea2eeda56718010cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T07:13:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 07:13:46.902726 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 07:13:46.902897 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 07:13:46.903679 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-172154110/tls.crt::/tmp/serving-cert-172154110/tls.key\\\\\\\"\\\\nI0320 07:13:47.353972 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 07:13:47.356217 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 07:13:47.356236 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 07:13:47.356252 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 07:13:47.356257 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 07:13:47.360047 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 07:13:47.360067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 07:13:47.360094 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360107 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360118 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 07:13:47.360127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 07:13:47.360134 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 07:13:47.360142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 07:13:47.361128 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:13:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb6e64ecd020e07bd8f22e52fcf960c975a09da0f06a9f43daf5bfbff01de3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:24Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.366972 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:24Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.385321 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g4qlg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19bf4391-88b7-43a0-9b6a-435261a44ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19e11d475744ecbce4f3285124657c66590dc339fe1af7d863b19d129ca09bbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://263bd63d005d0bec35609f642694605cd9fea786c2842c857d7f5efc7a6b6af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263bd63d005d0bec35609f642694605cd9fea786c2842c857d7f5efc7a6b6af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e9ed8774358234497eacb680940c422ab5086a1baa09efae600fd7e0080c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e9ed8774358234497eacb680940c422ab5086a1baa09efae600fd7e0080c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://812f9fb3af64c0bcffc23b7bef225d20328fe3348d910b174c99f2330ef75bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://812f9fb3af64c0bcffc23b7bef225d20328fe3348d910b174c99f2330ef75bcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g4qlg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:24Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.404054 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnwpn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdf0a692-3cf9-4abe-8b52-c81a040c0e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e24c08ed2a41f1d8a54c1c9edf5511f5ef6016bbdfa19cf6c40e8a639e1e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgjwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnwpn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:24Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.406396 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.406454 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.406479 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.406511 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.406533 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:24Z","lastTransitionTime":"2026-03-20T07:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.421389 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k56zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d19b89e-d048-4656-b5ce-c637190ab678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k56zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:24Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.440809 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c28e09a3eab6484907f72f0c4e3809f5a04d1b344fba717e6639f97c544acf83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:24Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.461822 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:24Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.477788 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12151228-1cb9-4086-9a62-f4a9583f5f69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://727db5182d25f135cceb40ce56d93c74fd6ff79a08e042fded129a1b8c96eb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e97608b8dbd15f9f6a4df363aa16c0f7e4a3d501a4182627876064290b63e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-confi
g-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxqfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:24Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.508759 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.508806 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.508823 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.508845 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.508862 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:24Z","lastTransitionTime":"2026-03-20T07:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.611445 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.611486 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.611500 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.611520 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.611532 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:24Z","lastTransitionTime":"2026-03-20T07:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.714164 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.714197 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.714206 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.714220 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.714231 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:24Z","lastTransitionTime":"2026-03-20T07:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.817882 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.817946 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.817964 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.817993 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.818013 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:24Z","lastTransitionTime":"2026-03-20T07:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.920987 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.921054 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.921078 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.921110 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:24 crc kubenswrapper[4749]: I0320 07:14:24.921130 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:24Z","lastTransitionTime":"2026-03-20T07:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:25 crc kubenswrapper[4749]: I0320 07:14:25.023531 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:25 crc kubenswrapper[4749]: I0320 07:14:25.023559 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:25 crc kubenswrapper[4749]: I0320 07:14:25.023567 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:25 crc kubenswrapper[4749]: I0320 07:14:25.023581 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:25 crc kubenswrapper[4749]: I0320 07:14:25.023590 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:25Z","lastTransitionTime":"2026-03-20T07:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:25 crc kubenswrapper[4749]: I0320 07:14:25.127338 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:25 crc kubenswrapper[4749]: I0320 07:14:25.127402 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:25 crc kubenswrapper[4749]: I0320 07:14:25.127424 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:25 crc kubenswrapper[4749]: I0320 07:14:25.127454 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:25 crc kubenswrapper[4749]: I0320 07:14:25.127480 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:25Z","lastTransitionTime":"2026-03-20T07:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:25 crc kubenswrapper[4749]: I0320 07:14:25.136883 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:25 crc kubenswrapper[4749]: I0320 07:14:25.137006 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:25 crc kubenswrapper[4749]: I0320 07:14:25.137039 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:25 crc kubenswrapper[4749]: I0320 07:14:25.137077 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:25 crc kubenswrapper[4749]: I0320 07:14:25.137116 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:25Z","lastTransitionTime":"2026-03-20T07:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:25 crc kubenswrapper[4749]: E0320 07:14:25.159872 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cbc31b-af36-4be8-8e88-99f024097007\\\",\\\"systemUUID\\\":\\\"42f570dd-c9b2-4d24-870f-033a21aa11c5\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:25Z is after 
2025-08-24T17:21:41Z" Mar 20 07:14:25 crc kubenswrapper[4749]: I0320 07:14:25.164830 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:25 crc kubenswrapper[4749]: I0320 07:14:25.164927 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:25 crc kubenswrapper[4749]: I0320 07:14:25.164949 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:25 crc kubenswrapper[4749]: I0320 07:14:25.164973 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:25 crc kubenswrapper[4749]: I0320 07:14:25.164991 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:25Z","lastTransitionTime":"2026-03-20T07:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:25 crc kubenswrapper[4749]: E0320 07:14:25.187192 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cbc31b-af36-4be8-8e88-99f024097007\\\",\\\"systemUUID\\\":\\\"42f570dd-c9b2-4d24-870f-033a21aa11c5\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:25Z is after 
2025-08-24T17:21:41Z" Mar 20 07:14:25 crc kubenswrapper[4749]: I0320 07:14:25.192774 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:25 crc kubenswrapper[4749]: I0320 07:14:25.192833 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:25 crc kubenswrapper[4749]: I0320 07:14:25.192856 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:25 crc kubenswrapper[4749]: I0320 07:14:25.192882 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:25 crc kubenswrapper[4749]: I0320 07:14:25.192902 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:25Z","lastTransitionTime":"2026-03-20T07:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:25 crc kubenswrapper[4749]: E0320 07:14:25.212045 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cbc31b-af36-4be8-8e88-99f024097007\\\",\\\"systemUUID\\\":\\\"42f570dd-c9b2-4d24-870f-033a21aa11c5\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:25Z is after 
2025-08-24T17:21:41Z" Mar 20 07:14:25 crc kubenswrapper[4749]: I0320 07:14:25.216965 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:25 crc kubenswrapper[4749]: I0320 07:14:25.217040 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:25 crc kubenswrapper[4749]: I0320 07:14:25.217062 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:25 crc kubenswrapper[4749]: I0320 07:14:25.217095 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:25 crc kubenswrapper[4749]: I0320 07:14:25.217118 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:25Z","lastTransitionTime":"2026-03-20T07:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:25 crc kubenswrapper[4749]: E0320 07:14:25.237377 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cbc31b-af36-4be8-8e88-99f024097007\\\",\\\"systemUUID\\\":\\\"42f570dd-c9b2-4d24-870f-033a21aa11c5\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:25Z is after 
2025-08-24T17:21:41Z" Mar 20 07:14:25 crc kubenswrapper[4749]: I0320 07:14:25.242617 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:25 crc kubenswrapper[4749]: I0320 07:14:25.242679 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:25 crc kubenswrapper[4749]: I0320 07:14:25.242699 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:25 crc kubenswrapper[4749]: I0320 07:14:25.242734 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:25 crc kubenswrapper[4749]: I0320 07:14:25.242752 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:25Z","lastTransitionTime":"2026-03-20T07:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:25 crc kubenswrapper[4749]: E0320 07:14:25.270689 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cbc31b-af36-4be8-8e88-99f024097007\\\",\\\"systemUUID\\\":\\\"42f570dd-c9b2-4d24-870f-033a21aa11c5\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:25Z is after 
2025-08-24T17:21:41Z" Mar 20 07:14:25 crc kubenswrapper[4749]: E0320 07:14:25.270833 4749 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 07:14:25 crc kubenswrapper[4749]: I0320 07:14:25.272665 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:25 crc kubenswrapper[4749]: I0320 07:14:25.272698 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:25 crc kubenswrapper[4749]: I0320 07:14:25.272710 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:25 crc kubenswrapper[4749]: I0320 07:14:25.272725 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:25 crc kubenswrapper[4749]: I0320 07:14:25.272738 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:25Z","lastTransitionTime":"2026-03-20T07:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:25 crc kubenswrapper[4749]: I0320 07:14:25.375856 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:25 crc kubenswrapper[4749]: I0320 07:14:25.375932 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:25 crc kubenswrapper[4749]: I0320 07:14:25.375954 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:25 crc kubenswrapper[4749]: I0320 07:14:25.375983 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:25 crc kubenswrapper[4749]: I0320 07:14:25.376006 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:25Z","lastTransitionTime":"2026-03-20T07:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
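The condition={...} payload in the setters.go entry above is plain JSON, so the Ready condition can be lifted out of a log line mechanically. A minimal sketch; the field names are exactly those in the entry, and the message value is truncated here for brevity:

    import json
    import re

    # One setters.go entry, abbreviated from the log above.
    LINE = ('I0320 07:14:25.272738 4749 setters.go:603] "Node became not ready" '
            'node="crc" condition={"type":"Ready","status":"False",'
            '"lastHeartbeatTime":"2026-03-20T07:14:25Z",'
            '"lastTransitionTime":"2026-03-20T07:14:25Z",'
            '"reason":"KubeletNotReady","message":"container runtime network not ready"}')

    m = re.search(r'condition=(\{.*\})', LINE)
    if m:
        cond = json.loads(m.group(1))
        # -> False KubeletNotReady container runtime network not ready
        print(cond['status'], cond['reason'], cond['message'])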
Mar 20 07:14:26 crc kubenswrapper[4749]: I0320 07:14:26.176560 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 07:14:26 crc kubenswrapper[4749]: I0320 07:14:26.176658 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 07:14:26 crc kubenswrapper[4749]: I0320 07:14:26.176708 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh"
Mar 20 07:14:26 crc kubenswrapper[4749]: E0320 07:14:26.176704 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 07:14:26 crc kubenswrapper[4749]: I0320 07:14:26.176665 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 07:14:26 crc kubenswrapper[4749]: E0320 07:14:26.176838 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 07:14:26 crc kubenswrapper[4749]: E0320 07:14:26.176950 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678"
Mar 20 07:14:26 crc kubenswrapper[4749]: E0320 07:14:26.177177 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 07:14:28 crc kubenswrapper[4749]: I0320 07:14:28.176528 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 07:14:28 crc kubenswrapper[4749]: I0320 07:14:28.176618 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 07:14:28 crc kubenswrapper[4749]: I0320 07:14:28.176622 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 07:14:28 crc kubenswrapper[4749]: I0320 07:14:28.176744 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh"
Mar 20 07:14:28 crc kubenswrapper[4749]: E0320 07:14:28.176731 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 07:14:28 crc kubenswrapper[4749]: E0320 07:14:28.176909 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 07:14:28 crc kubenswrapper[4749]: E0320 07:14:28.177025 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 07:14:28 crc kubenswrapper[4749]: E0320 07:14:28.177209 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678"
Has your network provider started?"} Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.017699 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.017772 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.017806 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.017837 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.017858 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:30Z","lastTransitionTime":"2026-03-20T07:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.121602 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.121672 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.121688 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.121712 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.121733 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:30Z","lastTransitionTime":"2026-03-20T07:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.176202 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.176318 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.176202 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.176420 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:14:30 crc kubenswrapper[4749]: E0320 07:14:30.176606 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 07:14:30 crc kubenswrapper[4749]: E0320 07:14:30.176762 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 07:14:30 crc kubenswrapper[4749]: E0320 07:14:30.177012 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678" Mar 20 07:14:30 crc kubenswrapper[4749]: E0320 07:14:30.177495 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.225362 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.225423 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.225440 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.225462 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.225481 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:30Z","lastTransitionTime":"2026-03-20T07:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.328536 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.328630 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.328651 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.328675 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.328692 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:30Z","lastTransitionTime":"2026-03-20T07:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.431915 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.431962 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.431982 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.432007 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.432027 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:30Z","lastTransitionTime":"2026-03-20T07:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.534691 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.534761 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.534777 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.534803 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.534821 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:30Z","lastTransitionTime":"2026-03-20T07:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.637689 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.637747 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.637759 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.638231 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.638245 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:30Z","lastTransitionTime":"2026-03-20T07:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.741814 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.741867 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.741883 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.741904 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.741921 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:30Z","lastTransitionTime":"2026-03-20T07:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.850044 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.850117 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.850140 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.850170 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.850274 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:30Z","lastTransitionTime":"2026-03-20T07:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.953743 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.953811 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.953830 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.953855 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:30 crc kubenswrapper[4749]: I0320 07:14:30.953873 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:30Z","lastTransitionTime":"2026-03-20T07:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.057498 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.057564 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.057580 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.057604 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.057621 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:31Z","lastTransitionTime":"2026-03-20T07:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.159949 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.160019 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.160037 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.160064 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.160081 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:31Z","lastTransitionTime":"2026-03-20T07:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.177704 4749 scope.go:117] "RemoveContainer" containerID="ee0e8c7afc39cdbcfdfb3a65e4f608334b574c4c5a19bc64de6ad9347174b9b3" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.263156 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.263536 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.263555 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.263579 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.263596 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:31Z","lastTransitionTime":"2026-03-20T07:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.365757 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.365792 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.365804 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.365821 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.365833 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:31Z","lastTransitionTime":"2026-03-20T07:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.469152 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.469219 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.469241 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.469269 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.469323 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:31Z","lastTransitionTime":"2026-03-20T07:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.478707 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tdgcw_2153d97b-a108-49f8-b6c8-8223ea65b878/ovnkube-controller/1.log" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.483071 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" event={"ID":"2153d97b-a108-49f8-b6c8-8223ea65b878","Type":"ContainerStarted","Data":"4ea02e47ca928e345d9158ed3cad49551f45914a2501a428a0d1ad63e4bf933e"} Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.483834 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.500039 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://114c64c832173caefe8b9d0030fd0ac53be4c97636f0d1735ad2b5149e38ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78309ee345e93c6d9fb93f2f6cd3b3b80b2a7feec2b0fbca5962e00d978c66c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:31Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.515617 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68xpr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://080ba314933e5d85aef3e133f4372bc6e5881f2b3cf3ce1e769927e6798f328e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c96bd69f6d76b0604262b3105aafd077a3b603667218b7e6b81a5fcb0b49b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68xpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:31Z is after 2025-08-24T17:21:41Z" Mar 20 
07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.540410 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d36aabe4-f4b7-4552-848b-0c22f7ac4753\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a9d3d56425dd88c89608d446f6d44c5f90644cea243dd023e74c5630a0a99e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e71bf5e132166e8d3e2f33eb325502e54ff36380220a07917135b27ebe41c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b332a4612c6855c57c6c15a305a1f56099dab01f849027ea2eeda56718010cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T07:13:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 07:13:46.902726 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 07:13:46.902897 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 07:13:46.903679 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-172154110/tls.crt::/tmp/serving-cert-172154110/tls.key\\\\\\\"\\\\nI0320 07:13:47.353972 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 07:13:47.356217 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 07:13:47.356236 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 07:13:47.356252 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 07:13:47.356257 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 07:13:47.360047 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 07:13:47.360067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 07:13:47.360094 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360107 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360118 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 07:13:47.360127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 07:13:47.360134 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 07:13:47.360142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 07:13:47.361128 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:13:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb6e64ecd020e07bd8f22e52fcf960c975a09da0f06a9f43daf5bfbff01de3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:31Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.553886 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:31Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.572050 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.572111 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.572124 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.572141 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.572152 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:31Z","lastTransitionTime":"2026-03-20T07:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.572610 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g4qlg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19bf4391-88b7-43a0-9b6a-435261a44ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19e11d475744ecbce4f3285124657c66590dc339fe1af7d863b19d129ca09bbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://263bd63d005d0bec35609f642694605cd9fea786c2842c857d7f5efc7a6b6af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263bd63d005d0bec35609f642694605cd9fea786c2842c857d7f5efc7a6b6af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e9ed8774358234497eacb680940c422ab5086a1baa09efae600fd7e0080c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e9ed8774358234497eacb680940c422ab5086a1baa09efae600fd7e0080c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://812f9fb3af64c0bcffc23b7bef225d20328fe3348d910b174c99f2330ef75bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://812f9fb3af64c0bcffc23b7bef225d20328fe3348d910b174c99f2330ef75bcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g4qlg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:31Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.588464 4749 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-fnwpn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdf0a692-3cf9-4abe-8b52-c81a040c0e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e24c08ed2a41f1d8a54c1c9edf5511f5ef6016bbdfa19cf6c40e8a639e1e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgjwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnwpn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:31Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.604053 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k56zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d19b89e-d048-4656-b5ce-c637190ab678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k56zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:31Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.625259 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c28e09a3eab6484907f72f0c4e3809f5a04d1b344fba717e6639f97c544acf83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:31Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.648517 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:31Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.662493 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12151228-1cb9-4086-9a62-f4a9583f5f69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://727db5182d25f135cceb40ce56d93c74fd6ff79a08e042fded129a1b8c96eb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e97608b8dbd15f9f6a4df363aa16c0f7e4a3d501a4182627876064290b63e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxqfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:31Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.674193 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.674233 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.674241 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.674255 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.674266 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:31Z","lastTransitionTime":"2026-03-20T07:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.688233 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2153d97b-a108-49f8-b6c8-8223ea65b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4f35833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea02e47ca928e345d9158ed3cad49551f45914a2501a428a0d1ad63e4bf933e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee0e8c7afc39cdbcfdfb3a65e4f608334b574c4c5a19bc64de6ad9347174b9b3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T07:14:15Z\\\",\\\"message\\\":\\\"controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:15Z is after 2025-08-24T17:21:41Z]\\\\nI0320 07:14:15.792463 6747 services_controller.go:434] Service openshift-kube-apiserver-operator/metrics retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{metrics openshift-kube-apiserver-operator 70a45401-9850-413a-87c2-e90a7258374e 4267 0 2025-02-23 05:12:37 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:kube-apiserver-operator] map[exclude.release.openshift.io/internal-openshift-hosted:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-secret-name:kube-apiserver-operator-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00761a26b \\\\u003cnil\\\\u003e}] [] 
[]},Spec:ServiceSpec{Ports:[]ServicePort{Service\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tdgcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:31Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.709610 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ade5670f-28bc-4c68-b28c-cec1ee830afe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f986500b6f9e4ab2cf3366a7e05e9274f9192bdc576e52c82f8dafc9f1ce37c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0549d8b1f6c2132ed8356ac6c67078f9431cf7a9b057922e0ba5e2eb9f7f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f97471d68760ce0f43e5c1c0bafa7c6b429812dd58e2b2fa2eabd378a0789d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://478f80cc5a3895ca8ae8adbaa46990b39941e28
24b2bf1c93ca34bb7d15cbdd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a361154efd67448ce4f9008639d02864d9d3aa766b0937f3729b13a5d0b8948a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:31Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.729304 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1384b577-c860-43e3-927f-3aa6d9eaadbc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3db71fc201b999f26a4841d7cff88cd6c415d1a2ad4920d354ed394ac8ad2982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ef
347d178defb70362fc7330bec72e266f77e6bd46c7ce4cf0c7018d585171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ef347d178defb70362fc7330bec72e266f77e6bd46c7ce4cf0c7018d585171\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:31Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.743429 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:31Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.755158 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf9225edfc659321c44243f73dd56d4661a1d16c7c2a53b7ef69768d6b88f8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:31Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.766567 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r9vtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5fc763-08fb-4b02-a3cd-6f85310f0e14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38353c2a5737bf0e7e3552efcf7c55c31fded95481f21b06f9d364a944dbeebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x656g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r9vtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:31Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.776644 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.776725 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.776745 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.776771 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.776789 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:31Z","lastTransitionTime":"2026-03-20T07:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.779993 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rcq9v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f813da7-84d4-4550-ad66-f282814444a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01cb7c52842132ac657c21cb0cac4167a7b0c07ac20803552a8290a0d19e008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkbh9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"host
IP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rcq9v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:31Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.878674 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.878722 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.878733 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.878750 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.878763 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:31Z","lastTransitionTime":"2026-03-20T07:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.980898 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.980934 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.980942 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.980956 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:31 crc kubenswrapper[4749]: I0320 07:14:31.980965 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:31Z","lastTransitionTime":"2026-03-20T07:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.083577 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.083650 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.083671 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.083698 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.083716 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:32Z","lastTransitionTime":"2026-03-20T07:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.176675 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.176731 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.176651 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.176933 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:14:32 crc kubenswrapper[4749]: E0320 07:14:32.177125 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 07:14:32 crc kubenswrapper[4749]: E0320 07:14:32.177351 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678" Mar 20 07:14:32 crc kubenswrapper[4749]: E0320 07:14:32.177472 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.177522 4749 scope.go:117] "RemoveContainer" containerID="f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8" Mar 20 07:14:32 crc kubenswrapper[4749]: E0320 07:14:32.177803 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.186967 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.187014 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.187033 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.187054 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.187072 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:32Z","lastTransitionTime":"2026-03-20T07:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.289713 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.289798 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.289822 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.289853 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.289878 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:32Z","lastTransitionTime":"2026-03-20T07:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.393249 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.393314 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.393326 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.393346 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.393359 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:32Z","lastTransitionTime":"2026-03-20T07:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.490180 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.492810 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"688e8fa067ea553fac09be724c46f16706c8b3463f09d6a4e2cfe3212027da17"} Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.493798 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.495406 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.495441 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.495457 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.495474 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.495488 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:32Z","lastTransitionTime":"2026-03-20T07:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.497329 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tdgcw_2153d97b-a108-49f8-b6c8-8223ea65b878/ovnkube-controller/2.log" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.498714 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tdgcw_2153d97b-a108-49f8-b6c8-8223ea65b878/ovnkube-controller/1.log" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.505540 4749 generic.go:334] "Generic (PLEG): container finished" podID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerID="4ea02e47ca928e345d9158ed3cad49551f45914a2501a428a0d1ad63e4bf933e" exitCode=1 Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.505769 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" event={"ID":"2153d97b-a108-49f8-b6c8-8223ea65b878","Type":"ContainerDied","Data":"4ea02e47ca928e345d9158ed3cad49551f45914a2501a428a0d1ad63e4bf933e"} Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.506015 4749 scope.go:117] "RemoveContainer" containerID="ee0e8c7afc39cdbcfdfb3a65e4f608334b574c4c5a19bc64de6ad9347174b9b3" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.506760 4749 scope.go:117] "RemoveContainer" containerID="4ea02e47ca928e345d9158ed3cad49551f45914a2501a428a0d1ad63e4bf933e" Mar 20 07:14:32 crc kubenswrapper[4749]: E0320 07:14:32.507178 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-tdgcw_openshift-ovn-kubernetes(2153d97b-a108-49f8-b6c8-8223ea65b878)\"" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.510743 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:32Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.524982 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12151228-1cb9-4086-9a62-f4a9583f5f69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://727db5182d25f135cceb40ce56d93c74fd6ff79a08e042fded129a1b8c96eb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e97608b8dbd15f9f6a4df363aa16c0f7e4a3d501a4182627876064290b63e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxqfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:32Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.539763 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c28e09a3eab6484907f72f0c4e3809f5a04d1b344fba717e6639f97c544acf83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:32Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.551172 4749 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1384b577-c860-43e3-927f-3aa6d9eaadbc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3db71fc201b999f26a4841d7cff88cd6c415d1a2ad4920d354ed394ac8ad2982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ef347d178defb70362fc7330bec72e266f77e6bd46c7ce4cf0c7018d585171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ef347d178defb70362fc7330bec72e266f77e6bd46c7ce4cf0c7018d585171\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:32Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.567135 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:32Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.582137 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf9225edfc659321c44243f73dd56d4661a1d16c7c2a53b7ef69768d6b88f8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:32Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.595446 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r9vtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5fc763-08fb-4b02-a3cd-6f85310f0e14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38353c2a5737bf0e7e3552efcf7c55c31fded95481f21b06f9d364a944dbeebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x656g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r9vtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:32Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.598133 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.598387 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.598524 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.598678 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.598824 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:32Z","lastTransitionTime":"2026-03-20T07:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.613920 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rcq9v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f813da7-84d4-4550-ad66-f282814444a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01cb7c52842132ac657c21cb0cac4167a7b0c07ac20803552a8290a0d19e008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkbh9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"host
IP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rcq9v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:32Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.634352 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2153d97b-a108-49f8-b6c8-8223ea65b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4f35833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea02e47ca928e345d9158ed3cad49551f45914a
2501a428a0d1ad63e4bf933e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee0e8c7afc39cdbcfdfb3a65e4f608334b574c4c5a19bc64de6ad9347174b9b3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T07:14:15Z\\\",\\\"message\\\":\\\"controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:15Z is after 2025-08-24T17:21:41Z]\\\\nI0320 07:14:15.792463 6747 services_controller.go:434] Service openshift-kube-apiserver-operator/metrics retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{metrics openshift-kube-apiserver-operator 70a45401-9850-413a-87c2-e90a7258374e 4267 0 2025-02-23 05:12:37 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:kube-apiserver-operator] map[exclude.release.openshift.io/internal-openshift-hosted:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-secret-name:kube-apiserver-operator-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00761a26b \\\\u003cnil\\\\u003e}] [] 
[]},Spec:ServiceSpec{Ports:[]ServicePort{Service\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tdgcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:32Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.664255 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ade5670f-28bc-4c68-b28c-cec1ee830afe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f986500b6f9e4ab2cf3366a7e05e9274f9192bdc576e52c82f8dafc9f1ce37c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0549d8b1f6c2132ed8356ac6c67078f9431cf7a9b057922e0ba5e2eb9f7f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f97471d68760ce0f43e5c1c0bafa7c6b429812dd58e2b2fa2eabd378a0789d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://478f80cc5a3895ca8ae8adbaa46990b39941e28
24b2bf1c93ca34bb7d15cbdd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a361154efd67448ce4f9008639d02864d9d3aa766b0937f3729b13a5d0b8948a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:32Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.679096 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68xpr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://080ba314933e5d85aef3e133f4372bc6e5881f2b3cf3ce1e769927e6798f328e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\
\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c96bd69f6d76b0604262b3105aafd077a3b603667218b7e6b81a5fcb0b49b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68xpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:32Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.696982 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://114c64c832173caefe8b9d0030fd0ac53be4c97636f0d1735ad2b5149e38ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78309ee345e93c6d9fb93f2f6cd3b3b80b2a7feec2b0fbca5962e00d978c66c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:32Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.701570 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.701629 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.701639 4749 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.701658 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.701667 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:32Z","lastTransitionTime":"2026-03-20T07:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.711645 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d36aabe4-f4b7-4552-848b-0c22f7ac4753\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a9d3d56425dd88c89608d446f6d44c5f90644cea243dd023e74c5630a0a99e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e71bf5e132166e8d3e2f33eb325502e54ff36380220a07917135b27ebe41c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b332a4612c6855c57c6c15a305a1f56099dab01f849027ea2eeda56718010cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://688e8fa067ea553fac09be724c46f16706c8b3463f09d6a4e2cfe3212027da17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T07:13:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 07:13:46.902726 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 07:13:46.902897 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 07:13:46.903679 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-172154110/tls.crt::/tmp/serving-cert-172154110/tls.key\\\\\\\"\\\\nI0320 07:13:47.353972 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 07:13:47.356217 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 07:13:47.356236 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 07:13:47.356252 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 07:13:47.356257 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 07:13:47.360047 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 07:13:47.360067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 07:13:47.360094 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360107 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360118 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 07:13:47.360127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 07:13:47.360134 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' 
detected.\\\\nW0320 07:13:47.360142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 07:13:47.361128 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:13:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb6e64ecd020e07bd8f22e52fcf960c975a09da0f06a9f43daf5bfbff01de3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:32Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.726347 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:32Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.743705 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g4qlg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19bf4391-88b7-43a0-9b6a-435261a44ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19e11d475744ecbce4f3285124657c66590dc339fe1af7d863b19d129ca09bbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://263bd63d005d0bec35609f642694605cd9fea786c2842c857d7f5efc7a6b6af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263bd63d005d0bec35609f642694605cd9fea786c2842c857d7f5efc7a6b6af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e9ed8774358234497eacb680940c422ab5086a1baa09efae600fd7e0080c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e9ed8774358234497eacb680940c422ab5086a1baa09efae600fd7e0080c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://812f9fb3af64c0bcffc23b7bef225d20328fe3348d910b174c99f2330ef75bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://812f9fb3af64c0bcffc23b7bef225d20328fe3348d910b174c99f2330ef75bcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g4qlg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:32Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.755429 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnwpn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdf0a692-3cf9-4abe-8b52-c81a040c0e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e24c08ed2a41f1d8a54c1c9edf5511f5ef6016bbdfa19cf6c40e8a639e1e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgjwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnwpn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:32Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.767982 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k56zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d19b89e-d048-4656-b5ce-c637190ab678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k56zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:32Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.780640 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf9225edfc659321c44243f73dd56d4661a1d16c7c2a53b7ef69768d6b88f8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:32Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.791382 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r9vtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5fc763-08fb-4b02-a3cd-6f85310f0e14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38353c2a5737bf0e7e3552efcf7c55c31fded95481f21b06f9d364a944dbeebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x656g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r9vtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:32Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.804036 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.804103 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.804122 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.804094 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rcq9v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f813da7-84d4-4550-ad66-f282814444a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01cb7c52842132ac657c21cb0cac4167a7b0c07ac20803552a8290a0d19e008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkbh9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rcq9v\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:32Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.804147 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.804265 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:32Z","lastTransitionTime":"2026-03-20T07:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.824735 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2153d97b-a108-49f8-b6c8-8223ea65b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4f35833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea02e47ca928e345d9158ed3cad49551f45914a
2501a428a0d1ad63e4bf933e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee0e8c7afc39cdbcfdfb3a65e4f608334b574c4c5a19bc64de6ad9347174b9b3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T07:14:15Z\\\",\\\"message\\\":\\\"controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:15Z is after 2025-08-24T17:21:41Z]\\\\nI0320 07:14:15.792463 6747 services_controller.go:434] Service openshift-kube-apiserver-operator/metrics retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{metrics openshift-kube-apiserver-operator 70a45401-9850-413a-87c2-e90a7258374e 4267 0 2025-02-23 05:12:37 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app:kube-apiserver-operator] map[exclude.release.openshift.io/internal-openshift-hosted:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-secret-name:kube-apiserver-operator-serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00761a26b \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{Service\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ea02e47ca928e345d9158ed3cad49551f45914a2501a428a0d1ad63e4bf933e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T07:14:32Z\\\",\\\"message\\\":\\\"od event handler 3\\\\nI0320 07:14:32.081966 6927 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 07:14:32.082007 6927 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 07:14:32.081939 6927 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 07:14:32.082053 6927 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 07:14:32.082092 6927 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 07:14:32.082103 6927 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 07:14:32.082153 6927 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 07:14:32.082169 6927 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 07:14:32.082188 6927 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 07:14:32.082205 6927 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 07:14:32.082223 6927 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 07:14:32.082225 6927 factory.go:656] Stopping watch factory\\\\nI0320 07:14:32.082255 6927 handler.go:208] Removed 
*v1.NetworkPolicy event handler 4\\\\nI0320 07:14:32.082358 6927 ovnkube.go:599] Stopped ovnkube\\\\nI0320 07:14:32.082420 6927 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 07:14:32.082513 6927 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\
":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tdgcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:32Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.852737 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ade5670f-28bc-4c68-b28c-cec1ee830afe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f986500b6f9e4ab2cf3366a7e05e9274f9192bdc576e52c82f8dafc9f1ce37c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0549d8b1f6c2132ed8356ac6c67078f9431cf7a9b057922e0ba5e2eb9f7f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f97471d68760ce0f43e5c1c0bafa7c6b429812dd58e2b2fa2eabd378a0789d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://478f80cc5a3895ca8ae8adbaa46990b39941e28
24b2bf1c93ca34bb7d15cbdd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a361154efd67448ce4f9008639d02864d9d3aa766b0937f3729b13a5d0b8948a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:32Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.867521 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1384b577-c860-43e3-927f-3aa6d9eaadbc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3db71fc201b999f26a4841d7cff88cd6c415d1a2ad4920d354ed394ac8ad2982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ef
347d178defb70362fc7330bec72e266f77e6bd46c7ce4cf0c7018d585171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ef347d178defb70362fc7330bec72e266f77e6bd46c7ce4cf0c7018d585171\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:32Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.883753 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:32Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.899709 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://114c64c832173caefe8b9d0030fd0ac53be4c97636f0d1735ad2b5149e38ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78309ee345e93c6d9fb93f2f6cd3b3b80b2a7feec2b0fbca5962e00d978c66c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:32Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.906933 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.906965 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.906978 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.906997 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.907011 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:32Z","lastTransitionTime":"2026-03-20T07:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.915047 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68xpr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://080ba314933e5d85aef3e133f4372bc6e5881f2b3cf3ce1e769927e6798f328e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c96bd69f6d76b0604262b3105aafd077a3b603667218b7e6b81a5fcb0b49b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68xpr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:32Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.928599 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnwpn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdf0a692-3cf9-4abe-8b52-c81a040c0e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e24c08ed2a41f1d8a54c1c9edf5511f5ef6016bbdfa19cf6c40e8a639e1e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgjwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnwpn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:32Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.941149 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k56zh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d19b89e-d048-4656-b5ce-c637190ab678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k56zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:32Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.955434 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d36aabe4-f4b7-4552-848b-0c22f7ac4753\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a9d3d56425dd88c89608d446f6d44c5f90644cea243dd023e74c5630a0a99e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e71bf5e132166e8d3e2f33eb325502e54ff36380220a07917135b27ebe41c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b332a4612c6855c57c6c15a305a1f56099dab01f849027ea2eeda56718010cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://688e8fa067ea553fac09be724c46f16706c8b3463f09d6a4e2cfe3212027da17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T07:13:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 07:13:46.902726 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 07:13:46.902897 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 07:13:46.903679 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-172154110/tls.crt::/tmp/serving-cert-172154110/tls.key\\\\\\\"\\\\nI0320 07:13:47.353972 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 07:13:47.356217 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 07:13:47.356236 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 07:13:47.356252 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 07:13:47.356257 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 07:13:47.360047 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 07:13:47.360067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 07:13:47.360094 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360107 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360118 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 07:13:47.360127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 07:13:47.360134 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 07:13:47.360142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 07:13:47.361128 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:13:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb6e64ecd020e07bd8f22e52fcf960c975a09da0f06a9f43daf5bfbff01de3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:32Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.969114 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:32Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:32 crc kubenswrapper[4749]: I0320 07:14:32.986010 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g4qlg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19bf4391-88b7-43a0-9b6a-435261a44ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19e11d475744ecbce4f3285124657c66590dc339fe1af7d863b19d129ca09bbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://263bd63d005d0bec35609f642694605cd9fea786c2842c857d7f5efc7a6b6af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263bd63d005d0bec35609f642694605cd9fea786c2842c857d7f5efc7a6b6af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e9ed8774358234497eacb680940c422ab5086a1baa09efae600fd7e0080c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e9ed8774358234497eacb680940c422ab5086a1baa09efae600fd7e0080c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://812f9fb3af64c0bcffc23b7bef225d20328fe3348d910b174c99f2330ef75bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://812f9fb3af64c0bcffc23b7bef225d20328fe3348d910b174c99f2330ef75bcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g4qlg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:32Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:32.999994 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c28e09a3eab6484907f72f0c4e3809f5a04d1b344fba717e6639f97c544acf83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:32Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.009021 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.009072 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.009082 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.009094 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.009102 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:33Z","lastTransitionTime":"2026-03-20T07:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.013756 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:33Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.026401 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12151228-1cb9-4086-9a62-f4a9583f5f69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://727db5182d25f135cceb40ce56d93c74fd6ff79a08e042fded129a1b8c96eb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e97608b8dbd15f9f6a4df363aa16c0f7e4a3d501a4182627876064290b63e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxqfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:33Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.112032 4749 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.112078 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.112094 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.112118 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.112136 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:33Z","lastTransitionTime":"2026-03-20T07:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.213951 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.213983 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.213994 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.214010 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.214021 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:33Z","lastTransitionTime":"2026-03-20T07:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.317037 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.317076 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.317093 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.317115 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.317131 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:33Z","lastTransitionTime":"2026-03-20T07:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.418891 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.419150 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.419247 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.419366 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.419458 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:33Z","lastTransitionTime":"2026-03-20T07:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.511907 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tdgcw_2153d97b-a108-49f8-b6c8-8223ea65b878/ovnkube-controller/2.log" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.519378 4749 scope.go:117] "RemoveContainer" containerID="4ea02e47ca928e345d9158ed3cad49551f45914a2501a428a0d1ad63e4bf933e" Mar 20 07:14:33 crc kubenswrapper[4749]: E0320 07:14:33.519699 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-tdgcw_openshift-ovn-kubernetes(2153d97b-a108-49f8-b6c8-8223ea65b878)\"" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.520980 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.521021 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.521079 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.521246 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.521311 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:33Z","lastTransitionTime":"2026-03-20T07:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.536538 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf9225edfc659321c44243f73dd56d4661a1d16c7c2a53b7ef69768d6b88f8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:33Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.548983 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r9vtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5fc763-08fb-4b02-a3cd-6f85310f0e14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38353c2a5737bf0e7e3552efcf7c55c31fded95481f21b06f9d364a944dbeebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x656g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r9vtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:33Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.562618 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rcq9v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f813da7-84d4-4550-ad66-f282814444a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01cb7c52842132ac657c21cb0cac4167a7b0c07ac20803552a8290a0d19e008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkbh9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rcq9v\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:33Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.584706 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2153d97b-a108-49f8-b6c8-8223ea65b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4f35
833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea02e47ca928e345d9158ed3cad49551f45914a2501a428a0d1ad63e4bf933e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ea02e47ca928e345d9158ed3cad49551f45914a2501a428a0d1ad63e4bf933e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T07:14:32Z\\\",\\\"message\\\":\\\"od event handler 3\\\\nI0320 07:14:32.081966 6927 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 07:14:32.082007 6927 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 07:14:32.081939 6927 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 07:14:32.082053 6927 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 07:14:32.082092 6927 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 07:14:32.082103 6927 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 07:14:32.082153 6927 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 07:14:32.082169 6927 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 07:14:32.082188 6927 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 07:14:32.082205 6927 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 07:14:32.082223 6927 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 07:14:32.082225 6927 factory.go:656] Stopping watch factory\\\\nI0320 07:14:32.082255 6927 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 07:14:32.082358 6927 ovnkube.go:599] Stopped ovnkube\\\\nI0320 07:14:32.082420 6927 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 07:14:32.082513 6927 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-tdgcw_openshift-ovn-kubernetes(2153d97b-a108-49f8-b6c8-8223ea65b878)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"r
ecursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tdgcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:33Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.612844 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ade5670f-28bc-4c68-b28c-cec1ee830afe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f986500b6f9e4ab2cf3366a7e05e9274f9192bdc576e52c82f8dafc9f1ce37c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0549d8b1f6c2132ed8356ac6c67078f9431cf7a9b057922e0ba5e2eb9f7f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f97471d68760ce0f43e5c1c0bafa7c6b429812dd58e2b2fa2eabd378a0789d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://478f80cc5a3895ca8ae8adbaa46990b39941e28
24b2bf1c93ca34bb7d15cbdd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a361154efd67448ce4f9008639d02864d9d3aa766b0937f3729b13a5d0b8948a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:33Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.623576 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.623648 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.623673 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.623702 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.623725 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:33Z","lastTransitionTime":"2026-03-20T07:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.625699 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1384b577-c860-43e3-927f-3aa6d9eaadbc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3db71fc201b999f26a4841d7cff88cd6c415d1a2ad4920d354ed394ac8ad2982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ef347d178defb70362fc7330bec72e266f77e6bd46c7ce4cf0c7018d585171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ef347d178defb70362fc7330bec72e266f77e6bd46c7ce4cf0c7018d585171\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:33Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.642816 4749 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:33Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.658608 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://114c64c832173caefe8b9d0030fd0ac53be4c97636f0d1735ad2b5149e38ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78309ee345e93c6d9fb93f2f6cd3b3b80b2a7feec2b0fbca5962e00d978c66c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:33Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.676511 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68xpr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://080ba314933e5d85aef3e133f4372bc6e5881f2b3cf3ce1e769927e6798f328e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c96bd69f6d76b0604262b3105aafd077a3b603667218b7e6b81a5fcb0b49b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68xpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:33Z is after 2025-08-24T17:21:41Z" Mar 20 
07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.690812 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnwpn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdf0a692-3cf9-4abe-8b52-c81a040c0e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e24c08ed2a41f1d8a54c1c9edf5511f5ef6016bbdfa19cf6c40e8a639e1e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgjwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnwpn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:33Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.705723 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k56zh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d19b89e-d048-4656-b5ce-c637190ab678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k56zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:33Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.719499 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d36aabe4-f4b7-4552-848b-0c22f7ac4753\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a9d3d56425dd88c89608d446f6d44c5f90644cea243dd023e74c5630a0a99e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e71bf5e132166e8d3e2f33eb325502e54ff36380220a07917135b27ebe41c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b332a4612c6855c57c6c15a305a1f56099dab01f849027ea2eeda56718010cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://688e8fa067ea553fac09be724c46f16706c8b3463f09d6a4e2cfe3212027da17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T07:13:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 07:13:46.902726 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 07:13:46.902897 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 07:13:46.903679 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-172154110/tls.crt::/tmp/serving-cert-172154110/tls.key\\\\\\\"\\\\nI0320 07:13:47.353972 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 07:13:47.356217 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 07:13:47.356236 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 07:13:47.356252 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 07:13:47.356257 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 07:13:47.360047 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 07:13:47.360067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 07:13:47.360094 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360107 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360118 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 07:13:47.360127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 07:13:47.360134 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 07:13:47.360142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 07:13:47.361128 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:13:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb6e64ecd020e07bd8f22e52fcf960c975a09da0f06a9f43daf5bfbff01de3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:33Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.727175 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.727235 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.727252 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.727274 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.727319 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:33Z","lastTransitionTime":"2026-03-20T07:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.733706 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:33Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.749627 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g4qlg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19bf4391-88b7-43a0-9b6a-435261a44ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19e11d475744ecbce4f3285124657c66590dc339fe1af7d863b19d129ca09bbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://263bd63d005d0bec35609f642694605cd9fea786c2842c857d7f5efc7a6b6af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263bd63d005d0bec35609f642694605cd9fea786c2842c857d7f5efc7a6b6af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e9ed8774358234497eacb680940c422ab5086a1baa09efae600fd7e0080c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e9ed8774358234497eacb680940c422ab5086a1baa09efae600fd7e0080c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://812f9fb3af64c0bcffc23b7bef225d20328fe3348d910b174c99f2330ef75bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://812f9fb3af64c0bcffc23b7bef225d20328fe3348d910b174c99f2330ef75bcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g4qlg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:33Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.762012 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c28e09a3eab6484907f72f0c4e3809f5a04d1b344fba717e6639f97c544acf83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:33Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.779940 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:33Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.794128 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12151228-1cb9-4086-9a62-f4a9583f5f69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://727db5182d25f135cceb40ce56d93c74fd6ff79a08e042fded129a1b8c96eb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e97608b8dbd15f9f6a4df363aa16c0f7e4a3d501a4182627876064290b63e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxqfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:33Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.830394 4749 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.830437 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.830451 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.830468 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.830480 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:33Z","lastTransitionTime":"2026-03-20T07:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.933904 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.933975 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.933998 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.934026 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:33 crc kubenswrapper[4749]: I0320 07:14:33.934048 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:33Z","lastTransitionTime":"2026-03-20T07:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.037048 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.037121 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.037145 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.037173 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.037192 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:34Z","lastTransitionTime":"2026-03-20T07:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.140111 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.140186 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.140209 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.140239 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.140260 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:34Z","lastTransitionTime":"2026-03-20T07:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.176336 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.176501 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:14:34 crc kubenswrapper[4749]: E0320 07:14:34.176658 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.176720 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.176781 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:14:34 crc kubenswrapper[4749]: E0320 07:14:34.176904 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 07:14:34 crc kubenswrapper[4749]: E0320 07:14:34.177054 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678" Mar 20 07:14:34 crc kubenswrapper[4749]: E0320 07:14:34.177187 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.196776 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12151228-1cb9-4086-9a62-f4a9583f5f69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://727db5182d25f135cceb40ce56d93c74fd6ff79a08e042fded129a1b8c96eb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e97608b8dbd15f9f6a4df363aa16c0f7e4a3d501a4182627876064290b63e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recur
siveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxqfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:34Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.218188 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c28e09a3eab6484907f72f0c4e3809f5a04d1b344fba717e6639f97c544acf83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:34Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.239392 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:34Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.244490 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.244552 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.244574 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.244602 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.244623 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:34Z","lastTransitionTime":"2026-03-20T07:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.258349 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1384b577-c860-43e3-927f-3aa6d9eaadbc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3db71fc201b999f26a4841d7cff88cd6c415d1a2ad4920d354ed394ac8ad2982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ef347d178defb70362fc7330bec72e266f77e6bd46c7ce4cf0c7018d585171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ef347d178defb70362fc7330bec72e266f77e6bd46c7ce4cf0c7018d585171\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:34Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.280777 4749 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:34Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.301943 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf9225edfc659321c44243f73dd56d4661a1d16c7c2a53b7ef69768d6b88f8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:34Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.325938 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r9vtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5fc763-08fb-4b02-a3cd-6f85310f0e14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38353c2a5737bf0e7e3552efcf7c55c31fded95481f21b06f9d364a944dbeebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x656g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r9vtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:34Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.347710 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.347749 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.347761 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.347777 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.347789 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:34Z","lastTransitionTime":"2026-03-20T07:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.348073 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rcq9v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f813da7-84d4-4550-ad66-f282814444a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01cb7c52842132ac657c21cb0cac4167a7b0c07ac20803552a8290a0d19e008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkbh9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"host
IP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rcq9v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:34Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.376685 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2153d97b-a108-49f8-b6c8-8223ea65b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4f35833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea02e47ca928e345d9158ed3cad49551f45914a
2501a428a0d1ad63e4bf933e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ea02e47ca928e345d9158ed3cad49551f45914a2501a428a0d1ad63e4bf933e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T07:14:32Z\\\",\\\"message\\\":\\\"od event handler 3\\\\nI0320 07:14:32.081966 6927 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 07:14:32.082007 6927 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 07:14:32.081939 6927 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 07:14:32.082053 6927 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 07:14:32.082092 6927 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 07:14:32.082103 6927 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 07:14:32.082153 6927 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 07:14:32.082169 6927 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 07:14:32.082188 6927 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 07:14:32.082205 6927 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 07:14:32.082223 6927 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 07:14:32.082225 6927 factory.go:656] Stopping watch factory\\\\nI0320 07:14:32.082255 6927 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 07:14:32.082358 6927 ovnkube.go:599] Stopped ovnkube\\\\nI0320 07:14:32.082420 6927 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 07:14:32.082513 6927 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tdgcw_openshift-ovn-kubernetes(2153d97b-a108-49f8-b6c8-8223ea65b878)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tdgcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:34Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.407508 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ade5670f-28bc-4c68-b28c-cec1ee830afe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f986500b6f9e4ab2cf3366a7e05e9274f9192bdc576e52c82f8dafc9f1ce37c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0549d8b1f6c2132ed8356a
c6c67078f9431cf7a9b057922e0ba5e2eb9f7f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f97471d68760ce0f43e5c1c0bafa7c6b429812dd58e2b2fa2eabd378a0789d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://478f80cc5a3895ca8ae8adbaa46990b39941e2824b2bf1c93ca34bb7d15cbdd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a361154efd67448ce4f9008639d02864d9d3aa766b0937f3729b13a5d0b8948a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:34Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.425689 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://114c64c832173caefe8b9d0030fd0ac53be4c97636f0d1735ad2b5149e38ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78309ee345e93c6d9fb93f2f6cd3b3b80b2a7feec2b0fbca5962e00d978c66c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:34Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.442448 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68xpr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://080ba314933e5d85aef3e133f4372bc6e5881f2b3cf3ce1e769927e6798f328e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c96bd69f6d76b0604262b3105aafd077a3b603667218b7e6b81a5fcb0b49b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68xpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:34Z is after 2025-08-24T17:21:41Z" Mar 20 
07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.449745 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.449782 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.449792 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.449808 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.449821 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:34Z","lastTransitionTime":"2026-03-20T07:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.457730 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:34Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.476835 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g4qlg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19bf4391-88b7-43a0-9b6a-435261a44ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19e11d475744ecbce4f3285124657c66590dc339fe1af7d863b19d129ca09bbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://263bd63d005d0bec35609f642694605cd9fea786c2842c857d7f5efc7a6b6af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263bd63d005d0bec35609f642694605cd9fea786c2842c857d7f5efc7a6b6af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e9ed8774358234497eacb680940c422ab5086a1baa09efae600fd7e0080c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e9ed8774358234497eacb680940c422ab5086a1baa09efae600fd7e0080c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://812f9fb3af64c0bcffc23b7bef225d20328fe3348d910b174c99f2330ef75bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://812f9fb3af64c0bcffc23b7bef225d20328fe3348d910b174c99f2330ef75bcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g4qlg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:34Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.493269 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnwpn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdf0a692-3cf9-4abe-8b52-c81a040c0e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e24c08ed2a41f1d8a54c1c9edf5511f5ef6016bbdfa19cf6c40e8a639e1e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgjwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnwpn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:34Z is after 2025-08-24T17:21:41Z" Mar 
20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.510377 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k56zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d19b89e-d048-4656-b5ce-c637190ab678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k56zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:34Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.532884 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d36aabe4-f4b7-4552-848b-0c22f7ac4753\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a9d3d56425dd88c89608d446f6d44c5f90644cea243dd023e74c5630a0a99e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e71bf5e132166e8d3e2f33eb325502e54ff36380220a07917135b27ebe41c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b332a4612c6855c57c6c15a305a1f56099dab01f849027ea2eeda56718010cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://688e8fa067ea553fac09be724c46f16706c8b3463f09d6a4e2cfe3212027da17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T07:13:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 07:13:46.902726 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 07:13:46.902897 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 07:13:46.903679 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-172154110/tls.crt::/tmp/serving-cert-172154110/tls.key\\\\\\\"\\\\nI0320 07:13:47.353972 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 07:13:47.356217 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 07:13:47.356236 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 07:13:47.356252 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 07:13:47.356257 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 07:13:47.360047 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 07:13:47.360067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 07:13:47.360094 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360107 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360118 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 07:13:47.360127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 07:13:47.360134 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 07:13:47.360142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 07:13:47.361128 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:13:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb6e64ecd020e07bd8f22e52fcf960c975a09da0f06a9f43daf5bfbff01de3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:34Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.552277 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.552413 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.552439 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.552472 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.552496 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:34Z","lastTransitionTime":"2026-03-20T07:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.654851 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.654910 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.654927 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.654951 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.654969 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:34Z","lastTransitionTime":"2026-03-20T07:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.759251 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.759358 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.759402 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.759427 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.759451 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:34Z","lastTransitionTime":"2026-03-20T07:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.862879 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.862927 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.862938 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.862956 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.862968 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:34Z","lastTransitionTime":"2026-03-20T07:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.966182 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.966249 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.966266 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.966327 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:34 crc kubenswrapper[4749]: I0320 07:14:34.966345 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:34Z","lastTransitionTime":"2026-03-20T07:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.068394 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.068436 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.068446 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.068461 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.068470 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:35Z","lastTransitionTime":"2026-03-20T07:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.171256 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.171342 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.171359 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.171383 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.171400 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:35Z","lastTransitionTime":"2026-03-20T07:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.274991 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.275338 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.275366 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.275394 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.275412 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:35Z","lastTransitionTime":"2026-03-20T07:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.354820 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.354932 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.354950 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.354974 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.354992 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:35Z","lastTransitionTime":"2026-03-20T07:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
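
The pod status-patch failures above and the node status-update retries that follow share a single root cause: the pod.network-node-identity.openshift.io / node.network-node-identity.openshift.io webhook serving at https://127.0.0.1:9743 presents a certificate whose NotAfter is 2025-08-24T17:21:41Z while the node clock reads 2026-03-20, so the API server aborts every webhook call during TLS verification and the patches are never evaluated. Below is a minimal Go sketch (not part of the log) for reading the validity window of the certificate the webhook actually presents; the address and port are taken from the error messages above, and running it on the CRC host itself is an assumption, since the endpoint is loopback-only.

	// certcheck.go: dial the webhook endpoint quoted in the kubelet errors
	// and print the validity window of the certificate it presents.
	package main

	import (
		"crypto/tls"
		"fmt"
		"log"
		"time"
	)

	func main() {
		// InsecureSkipVerify is deliberate: verification would fail on the
		// expired certificate, and the goal is to inspect it, not trust it.
		conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
		if err != nil {
			log.Fatalf("dial webhook endpoint: %v", err)
		}
		defer conn.Close()

		now := time.Now()
		for _, cert := range conn.ConnectionState().PeerCertificates {
			fmt.Printf("subject=%q notBefore=%s notAfter=%s expired=%v\n",
				cert.Subject.String(),
				cert.NotBefore.Format(time.RFC3339),
				cert.NotAfter.Format(time.RFC3339),
				now.After(cert.NotAfter))
		}
	}

If the printed notAfter matches the 2025-08-24T17:21:41Z deadline quoted in the errors, the failures are entirely on the certificate side and the patch payloads themselves are irrelevant; the interleaved NodeNotReady condition is a separate symptom, reporting only that the network provider has not yet written a CNI config into /etc/kubernetes/cni/net.d/.
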
Mar 20 07:14:35 crc kubenswrapper[4749]: E0320 07:14:35.377031 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cbc31b-af36-4be8-8e88-99f024097007\\\",\\\"systemUUID\\\":\\\"42f570dd-c9b2-4d24-870f-033a21aa11c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:35Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.384194 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.384251 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.384268 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.384323 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.384341 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:35Z","lastTransitionTime":"2026-03-20T07:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:35 crc kubenswrapper[4749]: E0320 07:14:35.404979 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cbc31b-af36-4be8-8e88-99f024097007\\\",\\\"systemUUID\\\":\\\"42f570dd-c9b2-4d24-870f-033a21aa11c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:35Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.409500 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.409563 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.409586 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.409622 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.409647 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:35Z","lastTransitionTime":"2026-03-20T07:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:35 crc kubenswrapper[4749]: E0320 07:14:35.430858 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cbc31b-af36-4be8-8e88-99f024097007\\\",\\\"systemUUID\\\":\\\"42f570dd-c9b2-4d24-870f-033a21aa11c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:35Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.436244 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.436323 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.436340 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.436363 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.436379 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:35Z","lastTransitionTime":"2026-03-20T07:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:35 crc kubenswrapper[4749]: E0320 07:14:35.457781 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cbc31b-af36-4be8-8e88-99f024097007\\\",\\\"systemUUID\\\":\\\"42f570dd-c9b2-4d24-870f-033a21aa11c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:35Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.462320 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.462385 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.462404 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.462433 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.462450 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:35Z","lastTransitionTime":"2026-03-20T07:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:35 crc kubenswrapper[4749]: E0320 07:14:35.485583 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cbc31b-af36-4be8-8e88-99f024097007\\\",\\\"systemUUID\\\":\\\"42f570dd-c9b2-4d24-870f-033a21aa11c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:35Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:35 crc kubenswrapper[4749]: E0320 07:14:35.485809 4749 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.487834 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.487920 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.487945 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.487976 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.488000 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:35Z","lastTransitionTime":"2026-03-20T07:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.591445 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.591577 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.591596 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.591621 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.591641 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:35Z","lastTransitionTime":"2026-03-20T07:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.694576 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.694635 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.694651 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.694674 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.694691 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:35Z","lastTransitionTime":"2026-03-20T07:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.797784 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.797865 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.797888 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.797913 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.797932 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:35Z","lastTransitionTime":"2026-03-20T07:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.901501 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.901555 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.901577 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.901605 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:35 crc kubenswrapper[4749]: I0320 07:14:35.901626 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:35Z","lastTransitionTime":"2026-03-20T07:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:36 crc kubenswrapper[4749]: I0320 07:14:36.005025 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:36 crc kubenswrapper[4749]: I0320 07:14:36.005087 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:36 crc kubenswrapper[4749]: I0320 07:14:36.005104 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:36 crc kubenswrapper[4749]: I0320 07:14:36.005129 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:36 crc kubenswrapper[4749]: I0320 07:14:36.005147 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:36Z","lastTransitionTime":"2026-03-20T07:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:36 crc kubenswrapper[4749]: I0320 07:14:36.025521 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 07:14:36 crc kubenswrapper[4749]: I0320 07:14:36.025650 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:14:36 crc kubenswrapper[4749]: I0320 07:14:36.025758 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:14:36 crc kubenswrapper[4749]: E0320 07:14:36.025828 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 07:15:08.02579076 +0000 UTC m=+144.575448447 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:14:36 crc kubenswrapper[4749]: E0320 07:14:36.025862 4749 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 07:14:36 crc kubenswrapper[4749]: E0320 07:14:36.025934 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 07:15:08.025915293 +0000 UTC m=+144.575572980 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 07:14:36 crc kubenswrapper[4749]: E0320 07:14:36.026115 4749 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 07:14:36 crc kubenswrapper[4749]: E0320 07:14:36.026239 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 07:15:08.026211911 +0000 UTC m=+144.575869598 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 07:14:36 crc kubenswrapper[4749]: I0320 07:14:36.108374 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:36 crc kubenswrapper[4749]: I0320 07:14:36.108459 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:36 crc kubenswrapper[4749]: I0320 07:14:36.108477 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:36 crc kubenswrapper[4749]: I0320 07:14:36.108498 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:36 crc kubenswrapper[4749]: I0320 07:14:36.108515 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:36Z","lastTransitionTime":"2026-03-20T07:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:36 crc kubenswrapper[4749]: I0320 07:14:36.127102 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:14:36 crc kubenswrapper[4749]: I0320 07:14:36.127202 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d19b89e-d048-4656-b5ce-c637190ab678-metrics-certs\") pod \"network-metrics-daemon-k56zh\" (UID: \"6d19b89e-d048-4656-b5ce-c637190ab678\") " pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:14:36 crc kubenswrapper[4749]: I0320 07:14:36.127247 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:14:36 crc kubenswrapper[4749]: E0320 07:14:36.127439 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 07:14:36 crc kubenswrapper[4749]: E0320 07:14:36.127464 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 07:14:36 crc kubenswrapper[4749]: E0320 07:14:36.127466 4749 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 07:14:36 crc kubenswrapper[4749]: E0320 07:14:36.127482 4749 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 07:14:36 crc kubenswrapper[4749]: E0320 07:14:36.127473 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 07:14:36 crc kubenswrapper[4749]: E0320 07:14:36.127584 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 07:14:36 crc kubenswrapper[4749]: E0320 07:14:36.127610 4749 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 07:14:36 crc kubenswrapper[4749]: E0320 07:14:36.127551 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-03-20 07:15:08.127530054 +0000 UTC m=+144.677187731 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 07:14:36 crc kubenswrapper[4749]: E0320 07:14:36.127681 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 07:15:08.127655527 +0000 UTC m=+144.677313234 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 07:14:36 crc kubenswrapper[4749]: E0320 07:14:36.127720 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d19b89e-d048-4656-b5ce-c637190ab678-metrics-certs podName:6d19b89e-d048-4656-b5ce-c637190ab678 nodeName:}" failed. No retries permitted until 2026-03-20 07:15:08.127701978 +0000 UTC m=+144.677359725 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6d19b89e-d048-4656-b5ce-c637190ab678-metrics-certs") pod "network-metrics-daemon-k56zh" (UID: "6d19b89e-d048-4656-b5ce-c637190ab678") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 07:14:36 crc kubenswrapper[4749]: I0320 07:14:36.176898 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:14:36 crc kubenswrapper[4749]: I0320 07:14:36.176960 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:14:36 crc kubenswrapper[4749]: I0320 07:14:36.177017 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:14:36 crc kubenswrapper[4749]: E0320 07:14:36.177150 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 07:14:36 crc kubenswrapper[4749]: I0320 07:14:36.177210 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:14:36 crc kubenswrapper[4749]: E0320 07:14:36.177329 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 07:14:36 crc kubenswrapper[4749]: E0320 07:14:36.177687 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 07:14:36 crc kubenswrapper[4749]: E0320 07:14:36.177835 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678" Mar 20 07:14:36 crc kubenswrapper[4749]: I0320 07:14:36.211452 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:36 crc kubenswrapper[4749]: I0320 07:14:36.211508 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:36 crc kubenswrapper[4749]: I0320 07:14:36.211526 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:36 crc kubenswrapper[4749]: I0320 07:14:36.211549 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:36 crc kubenswrapper[4749]: I0320 07:14:36.211566 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:36Z","lastTransitionTime":"2026-03-20T07:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
[log trimmed: the kubelet_node_status.go:724 events (NodeHasSufficientMemory, NodeHasNoDiskPressure, NodeHasSufficientPID, NodeNotReady) and the setters.go:603 "Node became not ready" condition above repeat unchanged at roughly 100 ms intervals from 07:14:36.315 through 07:14:38.172]
Mar 20 07:14:38 crc kubenswrapper[4749]: I0320 07:14:38.177116 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 07:14:38 crc kubenswrapper[4749]: I0320 07:14:38.177128 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 07:14:38 crc kubenswrapper[4749]: I0320 07:14:38.177150 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 07:14:38 crc kubenswrapper[4749]: E0320 07:14:38.177264 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 07:14:38 crc kubenswrapper[4749]: I0320 07:14:38.177346 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh"
Mar 20 07:14:38 crc kubenswrapper[4749]: E0320 07:14:38.177470 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 07:14:38 crc kubenswrapper[4749]: E0320 07:14:38.177617 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 07:14:38 crc kubenswrapper[4749]: E0320 07:14:38.177697 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678"
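[editor's note] Each "Node became not ready" entry embeds the node's Ready condition as JSON. The Go sketch below decodes the condition payload recorded at 07:14:36.211566 above; the struct mirrors only the fields visible in the log, a subset of Kubernetes' v1.NodeCondition, rather than the full API type.

// condition.go - decode the node Ready condition as logged by setters.go:603.
package main

import (
	"encoding/json"
	"fmt"
)

type nodeCondition struct {
	Type               string `json:"type"`
	Status             string `json:"status"`
	LastHeartbeatTime  string `json:"lastHeartbeatTime"`
	LastTransitionTime string `json:"lastTransitionTime"`
	Reason             string `json:"reason"`
	Message            string `json:"message"`
}

func main() {
	// Condition payload copied verbatim from the 07:14:36.211566 entry above.
	raw := `{"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:36Z","lastTransitionTime":"2026-03-20T07:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}`
	var c nodeCondition
	if err := json.Unmarshal([]byte(raw), &c); err != nil {
		panic(err)
	}
	fmt.Printf("%s=%s (%s): %s\n", c.Type, c.Status, c.Reason, c.Message)
}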
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 07:14:38 crc kubenswrapper[4749]: E0320 07:14:38.177697 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678" Mar 20 07:14:38 crc kubenswrapper[4749]: I0320 07:14:38.276172 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:38 crc kubenswrapper[4749]: I0320 07:14:38.276339 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:38 crc kubenswrapper[4749]: I0320 07:14:38.276360 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:38 crc kubenswrapper[4749]: I0320 07:14:38.276383 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:38 crc kubenswrapper[4749]: I0320 07:14:38.276400 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:38Z","lastTransitionTime":"2026-03-20T07:14:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:38 crc kubenswrapper[4749]: I0320 07:14:38.379455 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:38 crc kubenswrapper[4749]: I0320 07:14:38.379543 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:38 crc kubenswrapper[4749]: I0320 07:14:38.379561 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:38 crc kubenswrapper[4749]: I0320 07:14:38.379587 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:38 crc kubenswrapper[4749]: I0320 07:14:38.379606 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:38Z","lastTransitionTime":"2026-03-20T07:14:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:38 crc kubenswrapper[4749]: I0320 07:14:38.482326 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:38 crc kubenswrapper[4749]: I0320 07:14:38.482384 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:38 crc kubenswrapper[4749]: I0320 07:14:38.482403 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:38 crc kubenswrapper[4749]: I0320 07:14:38.482426 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:38 crc kubenswrapper[4749]: I0320 07:14:38.482445 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:38Z","lastTransitionTime":"2026-03-20T07:14:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:38 crc kubenswrapper[4749]: I0320 07:14:38.585481 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:38 crc kubenswrapper[4749]: I0320 07:14:38.585556 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:38 crc kubenswrapper[4749]: I0320 07:14:38.585581 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:38 crc kubenswrapper[4749]: I0320 07:14:38.585610 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:38 crc kubenswrapper[4749]: I0320 07:14:38.585632 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:38Z","lastTransitionTime":"2026-03-20T07:14:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:38 crc kubenswrapper[4749]: I0320 07:14:38.688203 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:38 crc kubenswrapper[4749]: I0320 07:14:38.688260 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:38 crc kubenswrapper[4749]: I0320 07:14:38.688273 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:38 crc kubenswrapper[4749]: I0320 07:14:38.688309 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:38 crc kubenswrapper[4749]: I0320 07:14:38.688320 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:38Z","lastTransitionTime":"2026-03-20T07:14:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:38 crc kubenswrapper[4749]: I0320 07:14:38.791661 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:38 crc kubenswrapper[4749]: I0320 07:14:38.791721 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:38 crc kubenswrapper[4749]: I0320 07:14:38.791738 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:38 crc kubenswrapper[4749]: I0320 07:14:38.791762 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:38 crc kubenswrapper[4749]: I0320 07:14:38.791778 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:38Z","lastTransitionTime":"2026-03-20T07:14:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:38 crc kubenswrapper[4749]: I0320 07:14:38.895353 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:38 crc kubenswrapper[4749]: I0320 07:14:38.895397 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:38 crc kubenswrapper[4749]: I0320 07:14:38.895411 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:38 crc kubenswrapper[4749]: I0320 07:14:38.895430 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:38 crc kubenswrapper[4749]: I0320 07:14:38.895443 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:38Z","lastTransitionTime":"2026-03-20T07:14:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:38 crc kubenswrapper[4749]: I0320 07:14:38.997695 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:38 crc kubenswrapper[4749]: I0320 07:14:38.997760 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:38 crc kubenswrapper[4749]: I0320 07:14:38.997778 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:38 crc kubenswrapper[4749]: I0320 07:14:38.997808 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:38 crc kubenswrapper[4749]: I0320 07:14:38.997826 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:38Z","lastTransitionTime":"2026-03-20T07:14:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:39 crc kubenswrapper[4749]: I0320 07:14:39.101093 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:39 crc kubenswrapper[4749]: I0320 07:14:39.101169 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:39 crc kubenswrapper[4749]: I0320 07:14:39.101196 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:39 crc kubenswrapper[4749]: I0320 07:14:39.101224 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:39 crc kubenswrapper[4749]: I0320 07:14:39.101240 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:39Z","lastTransitionTime":"2026-03-20T07:14:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:39 crc kubenswrapper[4749]: I0320 07:14:39.204032 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:39 crc kubenswrapper[4749]: I0320 07:14:39.204088 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:39 crc kubenswrapper[4749]: I0320 07:14:39.204105 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:39 crc kubenswrapper[4749]: I0320 07:14:39.204130 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:39 crc kubenswrapper[4749]: I0320 07:14:39.204148 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:39Z","lastTransitionTime":"2026-03-20T07:14:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:39 crc kubenswrapper[4749]: I0320 07:14:39.307517 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:39 crc kubenswrapper[4749]: I0320 07:14:39.307596 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:39 crc kubenswrapper[4749]: I0320 07:14:39.307614 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:39 crc kubenswrapper[4749]: I0320 07:14:39.307639 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:39 crc kubenswrapper[4749]: I0320 07:14:39.307663 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:39Z","lastTransitionTime":"2026-03-20T07:14:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:39 crc kubenswrapper[4749]: I0320 07:14:39.410702 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:39 crc kubenswrapper[4749]: I0320 07:14:39.410788 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:39 crc kubenswrapper[4749]: I0320 07:14:39.410809 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:39 crc kubenswrapper[4749]: I0320 07:14:39.410868 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:39 crc kubenswrapper[4749]: I0320 07:14:39.410887 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:39Z","lastTransitionTime":"2026-03-20T07:14:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:39 crc kubenswrapper[4749]: I0320 07:14:39.513487 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:39 crc kubenswrapper[4749]: I0320 07:14:39.513574 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:39 crc kubenswrapper[4749]: I0320 07:14:39.513597 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:39 crc kubenswrapper[4749]: I0320 07:14:39.513624 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:39 crc kubenswrapper[4749]: I0320 07:14:39.513645 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:39Z","lastTransitionTime":"2026-03-20T07:14:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:39 crc kubenswrapper[4749]: I0320 07:14:39.616251 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:39 crc kubenswrapper[4749]: I0320 07:14:39.616349 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:39 crc kubenswrapper[4749]: I0320 07:14:39.616366 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:39 crc kubenswrapper[4749]: I0320 07:14:39.616409 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:39 crc kubenswrapper[4749]: I0320 07:14:39.616427 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:39Z","lastTransitionTime":"2026-03-20T07:14:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:39 crc kubenswrapper[4749]: I0320 07:14:39.718959 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:39 crc kubenswrapper[4749]: I0320 07:14:39.719022 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:39 crc kubenswrapper[4749]: I0320 07:14:39.719040 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:39 crc kubenswrapper[4749]: I0320 07:14:39.719064 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:39 crc kubenswrapper[4749]: I0320 07:14:39.719081 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:39Z","lastTransitionTime":"2026-03-20T07:14:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:39 crc kubenswrapper[4749]: I0320 07:14:39.822465 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:39 crc kubenswrapper[4749]: I0320 07:14:39.822526 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:39 crc kubenswrapper[4749]: I0320 07:14:39.822544 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:39 crc kubenswrapper[4749]: I0320 07:14:39.822570 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:39 crc kubenswrapper[4749]: I0320 07:14:39.822588 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:39Z","lastTransitionTime":"2026-03-20T07:14:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:39 crc kubenswrapper[4749]: I0320 07:14:39.925771 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:39 crc kubenswrapper[4749]: I0320 07:14:39.925840 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:39 crc kubenswrapper[4749]: I0320 07:14:39.925863 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:39 crc kubenswrapper[4749]: I0320 07:14:39.925894 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:39 crc kubenswrapper[4749]: I0320 07:14:39.925916 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:39Z","lastTransitionTime":"2026-03-20T07:14:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.028875 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.028938 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.028956 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.028981 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.028999 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:40Z","lastTransitionTime":"2026-03-20T07:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.132813 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.132956 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.132985 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.133015 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.133037 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:40Z","lastTransitionTime":"2026-03-20T07:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.177105 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.177145 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.177181 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:14:40 crc kubenswrapper[4749]: E0320 07:14:40.177262 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
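[editor's note] The "No sandbox for pod" / "Error syncing pod" bursts for the same four pods recur at 07:14:36.177, 07:14:38.177, and 07:14:40.177, i.e. on a roughly 2 s resync cadence. The sketch below makes that cadence visible by scanning a saved copy of this log; the kubelet.log path is a placeholder, and the regular expression is tailored to the klog format seen in this excerpt.

// syncerrs.go - list the timestamps of each "Error syncing pod, skipping" entry per pod.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

func main() {
	f, err := os.Open("kubelet.log") // placeholder path to a saved copy of this log
	if err != nil {
		panic(err)
	}
	defer f.Close()

	// Capture the klog timestamp and the pod="..." field of each sync error.
	re := regexp.MustCompile(`E\d{4} (\d{2}:\d{2}:\d{2}\.\d+).*"Error syncing pod, skipping".*pod="([^"]+)"`)
	seen := map[string][]string{}
	sc := bufio.NewScanner(f)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // kubelet log lines can be very long
	for sc.Scan() {
		if m := re.FindStringSubmatch(sc.Text()); m != nil {
			seen[m[2]] = append(seen[m[2]], m[1])
		}
	}
	for pod, times := range seen {
		fmt.Printf("%s: %v\n", pod, times)
	}
}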
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.177349 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:14:40 crc kubenswrapper[4749]: E0320 07:14:40.177456 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678" Mar 20 07:14:40 crc kubenswrapper[4749]: E0320 07:14:40.177588 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 07:14:40 crc kubenswrapper[4749]: E0320 07:14:40.177670 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.236388 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.236436 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.236453 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.236476 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.236494 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:40Z","lastTransitionTime":"2026-03-20T07:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.338935 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.339053 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.339076 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.339101 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.339119 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:40Z","lastTransitionTime":"2026-03-20T07:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.442237 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.442423 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.442443 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.442469 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.442488 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:40Z","lastTransitionTime":"2026-03-20T07:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.545660 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.545713 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.545731 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.545757 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.545775 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:40Z","lastTransitionTime":"2026-03-20T07:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.648995 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.649039 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.649049 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.649067 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.649080 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:40Z","lastTransitionTime":"2026-03-20T07:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.751775 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.751835 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.751852 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.751877 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.751893 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:40Z","lastTransitionTime":"2026-03-20T07:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.855495 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.855573 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.855591 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.855615 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.855632 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:40Z","lastTransitionTime":"2026-03-20T07:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.959216 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.959276 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.959322 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.959346 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:40 crc kubenswrapper[4749]: I0320 07:14:40.959362 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:40Z","lastTransitionTime":"2026-03-20T07:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:41 crc kubenswrapper[4749]: I0320 07:14:41.062211 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:41 crc kubenswrapper[4749]: I0320 07:14:41.062339 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:41 crc kubenswrapper[4749]: I0320 07:14:41.062364 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:41 crc kubenswrapper[4749]: I0320 07:14:41.062395 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:41 crc kubenswrapper[4749]: I0320 07:14:41.062417 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:41Z","lastTransitionTime":"2026-03-20T07:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:41 crc kubenswrapper[4749]: I0320 07:14:41.165583 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:41 crc kubenswrapper[4749]: I0320 07:14:41.165649 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:41 crc kubenswrapper[4749]: I0320 07:14:41.165669 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:41 crc kubenswrapper[4749]: I0320 07:14:41.165695 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:41 crc kubenswrapper[4749]: I0320 07:14:41.165714 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:41Z","lastTransitionTime":"2026-03-20T07:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:41 crc kubenswrapper[4749]: I0320 07:14:41.268022 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:41 crc kubenswrapper[4749]: I0320 07:14:41.268092 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:41 crc kubenswrapper[4749]: I0320 07:14:41.268113 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:41 crc kubenswrapper[4749]: I0320 07:14:41.268141 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:41 crc kubenswrapper[4749]: I0320 07:14:41.268162 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:41Z","lastTransitionTime":"2026-03-20T07:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:41 crc kubenswrapper[4749]: I0320 07:14:41.371543 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:41 crc kubenswrapper[4749]: I0320 07:14:41.371622 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:41 crc kubenswrapper[4749]: I0320 07:14:41.371645 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:41 crc kubenswrapper[4749]: I0320 07:14:41.371674 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:41 crc kubenswrapper[4749]: I0320 07:14:41.371692 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:41Z","lastTransitionTime":"2026-03-20T07:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:41 crc kubenswrapper[4749]: I0320 07:14:41.474257 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:41 crc kubenswrapper[4749]: I0320 07:14:41.474343 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:41 crc kubenswrapper[4749]: I0320 07:14:41.474362 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:41 crc kubenswrapper[4749]: I0320 07:14:41.474385 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:41 crc kubenswrapper[4749]: I0320 07:14:41.474402 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:41Z","lastTransitionTime":"2026-03-20T07:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:41 crc kubenswrapper[4749]: I0320 07:14:41.577469 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:41 crc kubenswrapper[4749]: I0320 07:14:41.577538 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:41 crc kubenswrapper[4749]: I0320 07:14:41.577560 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:41 crc kubenswrapper[4749]: I0320 07:14:41.577591 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:41 crc kubenswrapper[4749]: I0320 07:14:41.577615 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:41Z","lastTransitionTime":"2026-03-20T07:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:41 crc kubenswrapper[4749]: I0320 07:14:41.680766 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:41 crc kubenswrapper[4749]: I0320 07:14:41.680829 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:41 crc kubenswrapper[4749]: I0320 07:14:41.680849 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:41 crc kubenswrapper[4749]: I0320 07:14:41.680873 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:41 crc kubenswrapper[4749]: I0320 07:14:41.680891 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:41Z","lastTransitionTime":"2026-03-20T07:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:41 crc kubenswrapper[4749]: I0320 07:14:41.783601 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:41 crc kubenswrapper[4749]: I0320 07:14:41.783665 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:41 crc kubenswrapper[4749]: I0320 07:14:41.783687 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:41 crc kubenswrapper[4749]: I0320 07:14:41.783716 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:41 crc kubenswrapper[4749]: I0320 07:14:41.783738 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:41Z","lastTransitionTime":"2026-03-20T07:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:41 crc kubenswrapper[4749]: I0320 07:14:41.889449 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:41 crc kubenswrapper[4749]: I0320 07:14:41.889511 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:41 crc kubenswrapper[4749]: I0320 07:14:41.889527 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:41 crc kubenswrapper[4749]: I0320 07:14:41.889550 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:41 crc kubenswrapper[4749]: I0320 07:14:41.889567 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:41Z","lastTransitionTime":"2026-03-20T07:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:41 crc kubenswrapper[4749]: I0320 07:14:41.992976 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:41 crc kubenswrapper[4749]: I0320 07:14:41.993036 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:41 crc kubenswrapper[4749]: I0320 07:14:41.993054 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:41 crc kubenswrapper[4749]: I0320 07:14:41.993079 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:41 crc kubenswrapper[4749]: I0320 07:14:41.993097 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:41Z","lastTransitionTime":"2026-03-20T07:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:42 crc kubenswrapper[4749]: I0320 07:14:42.095605 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:42 crc kubenswrapper[4749]: I0320 07:14:42.095673 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:42 crc kubenswrapper[4749]: I0320 07:14:42.095694 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:42 crc kubenswrapper[4749]: I0320 07:14:42.095723 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:42 crc kubenswrapper[4749]: I0320 07:14:42.095746 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:42Z","lastTransitionTime":"2026-03-20T07:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:42 crc kubenswrapper[4749]: I0320 07:14:42.176948 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:14:42 crc kubenswrapper[4749]: I0320 07:14:42.177094 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:14:42 crc kubenswrapper[4749]: E0320 07:14:42.177153 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 07:14:42 crc kubenswrapper[4749]: E0320 07:14:42.177327 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 07:14:42 crc kubenswrapper[4749]: I0320 07:14:42.177373 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:14:42 crc kubenswrapper[4749]: E0320 07:14:42.177624 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 07:14:42 crc kubenswrapper[4749]: I0320 07:14:42.176965 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:14:42 crc kubenswrapper[4749]: E0320 07:14:42.177822 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678" Mar 20 07:14:42 crc kubenswrapper[4749]: I0320 07:14:42.198728 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:42 crc kubenswrapper[4749]: I0320 07:14:42.198798 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:42 crc kubenswrapper[4749]: I0320 07:14:42.198814 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:42 crc kubenswrapper[4749]: I0320 07:14:42.198839 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:42 crc kubenswrapper[4749]: I0320 07:14:42.198856 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:42Z","lastTransitionTime":"2026-03-20T07:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:42 crc kubenswrapper[4749]: I0320 07:14:42.306429 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:42 crc kubenswrapper[4749]: I0320 07:14:42.306513 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:42 crc kubenswrapper[4749]: I0320 07:14:42.306533 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:42 crc kubenswrapper[4749]: I0320 07:14:42.306560 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:42 crc kubenswrapper[4749]: I0320 07:14:42.306588 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:42Z","lastTransitionTime":"2026-03-20T07:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:42 crc kubenswrapper[4749]: I0320 07:14:42.410158 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:42 crc kubenswrapper[4749]: I0320 07:14:42.410233 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:42 crc kubenswrapper[4749]: I0320 07:14:42.410267 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:42 crc kubenswrapper[4749]: I0320 07:14:42.410332 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:42 crc kubenswrapper[4749]: I0320 07:14:42.410355 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:42Z","lastTransitionTime":"2026-03-20T07:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:42 crc kubenswrapper[4749]: I0320 07:14:42.514341 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:42 crc kubenswrapper[4749]: I0320 07:14:42.514397 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:42 crc kubenswrapper[4749]: I0320 07:14:42.514415 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:42 crc kubenswrapper[4749]: I0320 07:14:42.514439 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:42 crc kubenswrapper[4749]: I0320 07:14:42.514456 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:42Z","lastTransitionTime":"2026-03-20T07:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:42 crc kubenswrapper[4749]: I0320 07:14:42.617511 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:42 crc kubenswrapper[4749]: I0320 07:14:42.617591 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:42 crc kubenswrapper[4749]: I0320 07:14:42.617615 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:42 crc kubenswrapper[4749]: I0320 07:14:42.617643 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:42 crc kubenswrapper[4749]: I0320 07:14:42.617665 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:42Z","lastTransitionTime":"2026-03-20T07:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:42 crc kubenswrapper[4749]: I0320 07:14:42.721030 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:42 crc kubenswrapper[4749]: I0320 07:14:42.721080 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:42 crc kubenswrapper[4749]: I0320 07:14:42.721097 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:42 crc kubenswrapper[4749]: I0320 07:14:42.721119 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:42 crc kubenswrapper[4749]: I0320 07:14:42.721136 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:42Z","lastTransitionTime":"2026-03-20T07:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:42 crc kubenswrapper[4749]: I0320 07:14:42.823966 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:42 crc kubenswrapper[4749]: I0320 07:14:42.824033 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:42 crc kubenswrapper[4749]: I0320 07:14:42.824055 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:42 crc kubenswrapper[4749]: I0320 07:14:42.824084 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:42 crc kubenswrapper[4749]: I0320 07:14:42.824105 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:42Z","lastTransitionTime":"2026-03-20T07:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:42 crc kubenswrapper[4749]: I0320 07:14:42.927152 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:42 crc kubenswrapper[4749]: I0320 07:14:42.927224 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:42 crc kubenswrapper[4749]: I0320 07:14:42.927244 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:42 crc kubenswrapper[4749]: I0320 07:14:42.927269 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:42 crc kubenswrapper[4749]: I0320 07:14:42.927322 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:42Z","lastTransitionTime":"2026-03-20T07:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:43 crc kubenswrapper[4749]: I0320 07:14:43.030270 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:43 crc kubenswrapper[4749]: I0320 07:14:43.030388 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:43 crc kubenswrapper[4749]: I0320 07:14:43.030412 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:43 crc kubenswrapper[4749]: I0320 07:14:43.030439 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:43 crc kubenswrapper[4749]: I0320 07:14:43.030462 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:43Z","lastTransitionTime":"2026-03-20T07:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:43 crc kubenswrapper[4749]: I0320 07:14:43.133726 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:43 crc kubenswrapper[4749]: I0320 07:14:43.133799 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:43 crc kubenswrapper[4749]: I0320 07:14:43.133818 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:43 crc kubenswrapper[4749]: I0320 07:14:43.133840 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:43 crc kubenswrapper[4749]: I0320 07:14:43.133856 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:43Z","lastTransitionTime":"2026-03-20T07:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:43 crc kubenswrapper[4749]: I0320 07:14:43.191778 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 20 07:14:43 crc kubenswrapper[4749]: I0320 07:14:43.237036 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:43 crc kubenswrapper[4749]: I0320 07:14:43.237125 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:43 crc kubenswrapper[4749]: I0320 07:14:43.237173 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:43 crc kubenswrapper[4749]: I0320 07:14:43.237198 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:43 crc kubenswrapper[4749]: I0320 07:14:43.237218 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:43Z","lastTransitionTime":"2026-03-20T07:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:43 crc kubenswrapper[4749]: I0320 07:14:43.340198 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:43 crc kubenswrapper[4749]: I0320 07:14:43.340419 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:43 crc kubenswrapper[4749]: I0320 07:14:43.340453 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:43 crc kubenswrapper[4749]: I0320 07:14:43.340489 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:43 crc kubenswrapper[4749]: I0320 07:14:43.340515 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:43Z","lastTransitionTime":"2026-03-20T07:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:43 crc kubenswrapper[4749]: I0320 07:14:43.444264 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:43 crc kubenswrapper[4749]: I0320 07:14:43.444369 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:43 crc kubenswrapper[4749]: I0320 07:14:43.444432 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:43 crc kubenswrapper[4749]: I0320 07:14:43.444465 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:43 crc kubenswrapper[4749]: I0320 07:14:43.444530 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:43Z","lastTransitionTime":"2026-03-20T07:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:43 crc kubenswrapper[4749]: I0320 07:14:43.548357 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:43 crc kubenswrapper[4749]: I0320 07:14:43.548455 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:43 crc kubenswrapper[4749]: I0320 07:14:43.548475 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:43 crc kubenswrapper[4749]: I0320 07:14:43.548499 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:43 crc kubenswrapper[4749]: I0320 07:14:43.548516 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:43Z","lastTransitionTime":"2026-03-20T07:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:43 crc kubenswrapper[4749]: I0320 07:14:43.652001 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:43 crc kubenswrapper[4749]: I0320 07:14:43.652072 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:43 crc kubenswrapper[4749]: I0320 07:14:43.652090 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:43 crc kubenswrapper[4749]: I0320 07:14:43.652116 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:43 crc kubenswrapper[4749]: I0320 07:14:43.652133 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:43Z","lastTransitionTime":"2026-03-20T07:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:43 crc kubenswrapper[4749]: I0320 07:14:43.754688 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:43 crc kubenswrapper[4749]: I0320 07:14:43.754775 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:43 crc kubenswrapper[4749]: I0320 07:14:43.754801 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:43 crc kubenswrapper[4749]: I0320 07:14:43.754833 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:43 crc kubenswrapper[4749]: I0320 07:14:43.754856 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:43Z","lastTransitionTime":"2026-03-20T07:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:43 crc kubenswrapper[4749]: I0320 07:14:43.857352 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:43 crc kubenswrapper[4749]: I0320 07:14:43.857384 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:43 crc kubenswrapper[4749]: I0320 07:14:43.857392 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:43 crc kubenswrapper[4749]: I0320 07:14:43.857456 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:43 crc kubenswrapper[4749]: I0320 07:14:43.857468 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:43Z","lastTransitionTime":"2026-03-20T07:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:43 crc kubenswrapper[4749]: I0320 07:14:43.959407 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:43 crc kubenswrapper[4749]: I0320 07:14:43.959433 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:43 crc kubenswrapper[4749]: I0320 07:14:43.959442 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:43 crc kubenswrapper[4749]: I0320 07:14:43.959456 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:43 crc kubenswrapper[4749]: I0320 07:14:43.959467 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:43Z","lastTransitionTime":"2026-03-20T07:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:44 crc kubenswrapper[4749]: I0320 07:14:44.063011 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:44 crc kubenswrapper[4749]: I0320 07:14:44.063088 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:44 crc kubenswrapper[4749]: I0320 07:14:44.063115 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:44 crc kubenswrapper[4749]: I0320 07:14:44.063148 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:44 crc kubenswrapper[4749]: I0320 07:14:44.063171 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:44Z","lastTransitionTime":"2026-03-20T07:14:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:44 crc kubenswrapper[4749]: E0320 07:14:44.163499 4749 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 20 07:14:44 crc kubenswrapper[4749]: I0320 07:14:44.177096 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:14:44 crc kubenswrapper[4749]: E0320 07:14:44.177379 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 07:14:44 crc kubenswrapper[4749]: I0320 07:14:44.177861 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:14:44 crc kubenswrapper[4749]: E0320 07:14:44.178044 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 07:14:44 crc kubenswrapper[4749]: I0320 07:14:44.178467 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:14:44 crc kubenswrapper[4749]: E0320 07:14:44.178662 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 07:14:44 crc kubenswrapper[4749]: I0320 07:14:44.178942 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:14:44 crc kubenswrapper[4749]: E0320 07:14:44.179136 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678" Mar 20 07:14:44 crc kubenswrapper[4749]: I0320 07:14:44.202179 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://114c64c832173caefe8b9d0030fd0ac53be4c97636f0d1735ad2b5149e38ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78309ee345e93c6d9fb93f2f6cd3b3b80b2a7feec2b0fbca5962e00d978c66c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:44Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:44 crc kubenswrapper[4749]: I0320 07:14:44.219052 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68xpr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://080ba314933e5d85aef3e133f4372bc6e5881f2b3cf3ce1e769927e6798f328e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c96bd69f6d76b0604262b3105aafd077a3b603667218b7e6b81a5fcb0b49b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68xpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:44Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:44 crc kubenswrapper[4749]: I0320 07:14:44.234244 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k56zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d19b89e-d048-4656-b5ce-c637190ab678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k56zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:44Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:44 crc kubenswrapper[4749]: I0320 07:14:44.254849 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d36aabe4-f4b7-4552-848b-0c22f7ac4753\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a9d3d56425dd88c89608d446f6d44c5f90644cea243dd023e74c5630a0a99e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e71bf5e132166e8d3e2f33eb325502e54ff36380220a07917135b27ebe41c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b332a4612c6855c57c6c15a305a1f56099dab01f849027ea2eeda56718010cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://688e8fa067ea553fac09be724c46f16706c8b3463f09d6a4e2cfe3212027da17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T07:13:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 07:13:46.902726 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 07:13:46.902897 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 07:13:46.903679 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-172154110/tls.crt::/tmp/serving-cert-172154110/tls.key\\\\\\\"\\\\nI0320 07:13:47.353972 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 07:13:47.356217 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 07:13:47.356236 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 07:13:47.356252 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 07:13:47.356257 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 07:13:47.360047 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 07:13:47.360067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 07:13:47.360094 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360107 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360118 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 07:13:47.360127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 07:13:47.360134 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 07:13:47.360142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 07:13:47.361128 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:13:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb6e64ecd020e07bd8f22e52fcf960c975a09da0f06a9f43daf5bfbff01de3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:44Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:44 crc kubenswrapper[4749]: I0320 07:14:44.276637 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:44Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:44 crc kubenswrapper[4749]: I0320 07:14:44.294467 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g4qlg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19bf4391-88b7-43a0-9b6a-435261a44ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19e11d475744ecbce4f3285124657c66590dc339fe1af7d863b19d129ca09bbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://263bd63d005d0bec35609f642694605cd9fea786c2842c857d7f5efc7a6b6af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263bd63d005d0bec35609f642694605cd9fea786c2842c857d7f5efc7a6b6af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e9ed8774358234497eacb680940c422ab5086a1baa09efae600fd7e0080c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e9ed8774358234497eacb680940c422ab5086a1baa09efae600fd7e0080c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://812f9fb3af64c0bcffc23b7bef225d20328fe3348d910b174c99f2330ef75bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://812f9fb3af64c0bcffc23b7bef225d20328fe3348d910b174c99f2330ef75bcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g4qlg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:44Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:44 crc kubenswrapper[4749]: E0320 07:14:44.304602 4749 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 07:14:44 crc kubenswrapper[4749]: I0320 07:14:44.311315 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnwpn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdf0a692-3cf9-4abe-8b52-c81a040c0e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e24c08ed2a41f1d8a54c1c9edf5511f5ef6016bbdfa19cf6c40e8a639e1e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgjwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnwpn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:44Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:44 crc kubenswrapper[4749]: I0320 07:14:44.331008 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c28e09a3eab6484907f72f0c4e3809f5a04d1b344fba717e6639f97c544acf83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:44Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:44 crc kubenswrapper[4749]: I0320 07:14:44.348742 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:44Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:44 crc kubenswrapper[4749]: I0320 07:14:44.363440 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12151228-1cb9-4086-9a62-f4a9583f5f69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://727db5182d25f135cceb40ce56d93c74fd6ff79a08e042fded129a1b8c96eb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e97608b8dbd15f9f6a4df363aa16c0f7e4a3d501a4182627876064290b63e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxqfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:44Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:44 crc kubenswrapper[4749]: I0320 07:14:44.376927 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r9vtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5fc763-08fb-4b02-a3cd-6f85310f0e14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38353c2a5737bf0e7e3552efcf7c55c31fded95481f21b06f9d364a944dbeebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x656g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r9vtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:44Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:44 crc kubenswrapper[4749]: I0320 07:14:44.391607 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rcq9v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f813da7-84d4-4550-ad66-f282814444a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01cb7c52842132ac657c21cb0cac4167a7b0c07ac20803552a8290a0d19e008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access
-xkbh9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rcq9v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:44Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:44 crc kubenswrapper[4749]: I0320 07:14:44.409659 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2153d97b-a108-49f8-b6c8-8223ea65b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4f35833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea02e47ca928e345d9158ed3cad49551f45914a
2501a428a0d1ad63e4bf933e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ea02e47ca928e345d9158ed3cad49551f45914a2501a428a0d1ad63e4bf933e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T07:14:32Z\\\",\\\"message\\\":\\\"od event handler 3\\\\nI0320 07:14:32.081966 6927 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 07:14:32.082007 6927 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 07:14:32.081939 6927 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 07:14:32.082053 6927 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 07:14:32.082092 6927 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 07:14:32.082103 6927 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 07:14:32.082153 6927 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 07:14:32.082169 6927 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 07:14:32.082188 6927 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 07:14:32.082205 6927 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 07:14:32.082223 6927 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 07:14:32.082225 6927 factory.go:656] Stopping watch factory\\\\nI0320 07:14:32.082255 6927 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 07:14:32.082358 6927 ovnkube.go:599] Stopped ovnkube\\\\nI0320 07:14:32.082420 6927 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 07:14:32.082513 6927 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tdgcw_openshift-ovn-kubernetes(2153d97b-a108-49f8-b6c8-8223ea65b878)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tdgcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:44Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:44 crc kubenswrapper[4749]: I0320 07:14:44.431973 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ade5670f-28bc-4c68-b28c-cec1ee830afe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f986500b6f9e4ab2cf3366a7e05e9274f9192bdc576e52c82f8dafc9f1ce37c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0549d8b1f6c2132ed8356a
c6c67078f9431cf7a9b057922e0ba5e2eb9f7f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f97471d68760ce0f43e5c1c0bafa7c6b429812dd58e2b2fa2eabd378a0789d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://478f80cc5a3895ca8ae8adbaa46990b39941e2824b2bf1c93ca34bb7d15cbdd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a361154efd67448ce4f9008639d02864d9d3aa766b0937f3729b13a5d0b8948a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:44Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:44 crc kubenswrapper[4749]: I0320 07:14:44.444597 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a9a22f5-482c-4da4-b2c6-cb1b9dad05e1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93eaff3eb0b1240b3d19fdd70f9a27c0543e794b7bb61e3e1886807a8a712758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://189ab91bc96e9893f362ea6fae4ae81880b230b84a3987760796360150187043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9041a702c52186ffb23b29e5a5ddeddefef6f576571a20f1f43027ff3225641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830983de883dd8f7cd7c3da3c23b2d33e795b3c75222381378c17d43f8fb435f\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://830983de883dd8f7cd7c3da3c23b2d33e795b3c75222381378c17d43f8fb435f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:44Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:44 crc kubenswrapper[4749]: I0320 07:14:44.461152 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1384b577-c860-43e3-927f-3aa6d9eaadbc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3db71fc201b999f26a4841d7cff88cd6c415d1a2ad4920d354ed394ac8ad2982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ef347d178defb70362fc7330bec72e266f77e6bd46c7ce4cf0c7018d585171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ef347d178defb70362fc7330bec72e266f77e6bd46c7ce4cf0c7018d585171\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:44Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:44 crc kubenswrapper[4749]: I0320 07:14:44.491011 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:44Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:44 crc kubenswrapper[4749]: I0320 07:14:44.510739 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf9225edfc659321c44243f73dd56d4661a1d16c7c2a53b7ef69768d6b88f8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:44Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:45 crc kubenswrapper[4749]: I0320 07:14:45.588399 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:45 crc kubenswrapper[4749]: I0320 07:14:45.588460 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 07:14:45 crc kubenswrapper[4749]: I0320 07:14:45.588476 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:45 crc kubenswrapper[4749]: I0320 07:14:45.588498 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:45 crc kubenswrapper[4749]: I0320 07:14:45.588515 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:45Z","lastTransitionTime":"2026-03-20T07:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:45 crc kubenswrapper[4749]: E0320 07:14:45.609889 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cbc31b-af36-4be8-8e88-99f024097007\\\",\\\"systemUUID\\\":\\\"42f570dd-c9b2-4d24-870f-033a21aa11c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:45Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:45 crc kubenswrapper[4749]: I0320 07:14:45.615703 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:45 crc kubenswrapper[4749]: I0320 07:14:45.615762 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 07:14:45 crc kubenswrapper[4749]: I0320 07:14:45.615781 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:45 crc kubenswrapper[4749]: I0320 07:14:45.615804 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:45 crc kubenswrapper[4749]: I0320 07:14:45.615821 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:45Z","lastTransitionTime":"2026-03-20T07:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:45 crc kubenswrapper[4749]: E0320 07:14:45.635476 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cbc31b-af36-4be8-8e88-99f024097007\\\",\\\"systemUUID\\\":\\\"42f570dd-c9b2-4d24-870f-033a21aa11c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:45Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:45 crc kubenswrapper[4749]: I0320 07:14:45.640810 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:45 crc kubenswrapper[4749]: I0320 07:14:45.640862 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
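Note on the recurring x509 failure above: every one of these patches is rejected at the same point. The client side of the network-node-identity webhook call refuses the serving certificate because the node clock (2026-03-20T07:14:45Z) is past the certificate's NotAfter (2025-08-24T17:21:41Z), so no status update can get through. Below is a minimal Go sketch of the validity check that crypto/x509 applies during the TLS handshake; the certificate path and program are hypothetical, for illustration only, and are not kubelet or webhook source.

// Minimal sketch: reproduce the certificate-validity check behind
// "x509: certificate has expired or is not yet valid".
package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"os"
	"time"
)

func main() {
	// Hypothetical location of the webhook's serving certificate.
	pemBytes, err := os.ReadFile("/etc/webhook/serving-cert.pem")
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	block, _ := pem.Decode(pemBytes)
	if block == nil {
		fmt.Fprintln(os.Stderr, "no PEM block found")
		os.Exit(1)
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	now := time.Now()
	// crypto/x509 rejects a chain when now falls outside
	// [NotBefore, NotAfter], which is exactly the error logged above.
	switch {
	case now.After(cert.NotAfter):
		fmt.Printf("expired: current time %s is after %s\n",
			now.Format(time.RFC3339), cert.NotAfter.Format(time.RFC3339))
	case now.Before(cert.NotBefore):
		fmt.Printf("not yet valid: %s is before %s\n",
			now.Format(time.RFC3339), cert.NotBefore.Format(time.RFC3339))
	default:
		fmt.Println("certificate is within its validity window")
	}
}

Because neither the clock nor the certificate changes between attempts, each retry that follows fails identically; the duplicate payloads are elided in favor of the first full copy at 07:14:45.609889.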
event="NodeHasNoDiskPressure" Mar 20 07:14:45 crc kubenswrapper[4749]: I0320 07:14:45.640879 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:45 crc kubenswrapper[4749]: I0320 07:14:45.640905 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:45 crc kubenswrapper[4749]: I0320 07:14:45.640923 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:45Z","lastTransitionTime":"2026-03-20T07:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:45 crc kubenswrapper[4749]: E0320 07:14:45.662427 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cbc31b-af36-4be8-8e88-99f024097007\\\",\\\"systemUUID\\\":\\\"42f570dd-c9b2-4d24-870f-033a21aa11c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:45Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:45 crc kubenswrapper[4749]: I0320 07:14:45.668671 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:45 crc kubenswrapper[4749]: I0320 07:14:45.669073 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
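Note on the NodeNotReady condition being set in the entries above: it is independent of the webhook problem. The container runtime reports NetworkReady=false because /etc/kubernetes/cni/net.d/ contains no CNI network configuration, and the kubelet propagates that as Ready=False. A short Go sketch of the directory scan involved follows; the accepted extensions are the ones libcni conventionally loads, and this is illustrative only, not kubelet source.

// Minimal sketch: report whether the CNI conf directory named in the
// log message contains any loadable network configuration.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	confDir := "/etc/kubernetes/cni/net.d" // directory from the log message
	entries, err := os.ReadDir(confDir)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	var confs []string
	for _, e := range entries {
		if e.IsDir() {
			continue
		}
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json": // extensions libcni accepts
			confs = append(confs, e.Name())
		}
	}
	if len(confs) == 0 {
		fmt.Println("no CNI configuration file found; node stays NotReady")
		return
	}
	fmt.Println("CNI configurations present:", confs)
}

Once the network provider (here presumably OVN-Kubernetes, given the network-node-identity webhook) writes its conflist into that directory, the condition should clear on a subsequent kubelet sync.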
event="NodeHasNoDiskPressure" Mar 20 07:14:45 crc kubenswrapper[4749]: I0320 07:14:45.669320 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:45 crc kubenswrapper[4749]: I0320 07:14:45.669535 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:45 crc kubenswrapper[4749]: I0320 07:14:45.669708 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:45Z","lastTransitionTime":"2026-03-20T07:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:45 crc kubenswrapper[4749]: E0320 07:14:45.692119 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cbc31b-af36-4be8-8e88-99f024097007\\\",\\\"systemUUID\\\":\\\"42f570dd-c9b2-4d24-870f-033a21aa11c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:45Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:45 crc kubenswrapper[4749]: I0320 07:14:45.697602 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:45 crc kubenswrapper[4749]: I0320 07:14:45.697660 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure"
Mar 20 07:14:45 crc kubenswrapper[4749]: I0320 07:14:45.697678 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 07:14:45 crc kubenswrapper[4749]: I0320 07:14:45.697700 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 07:14:45 crc kubenswrapper[4749]: I0320 07:14:45.697717 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:45Z","lastTransitionTime":"2026-03-20T07:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 07:14:45 crc kubenswrapper[4749]: E0320 07:14:45.718139 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cbc31b-af36-4be8-8e88-99f024097007\\\",\\\"systemUUID\\\":\\\"42f570dd-c9b2-4d24-870f-033a21aa11c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:45Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:45 crc kubenswrapper[4749]: E0320 07:14:45.718251 4749 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 07:14:46 crc kubenswrapper[4749]: I0320 07:14:46.176252 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 07:14:46 crc kubenswrapper[4749]: I0320 07:14:46.176319 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 07:14:46 crc kubenswrapper[4749]: E0320 07:14:46.177373 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 07:14:46 crc kubenswrapper[4749]: I0320 07:14:46.176487 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh"
Mar 20 07:14:46 crc kubenswrapper[4749]: I0320 07:14:46.176379 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 07:14:46 crc kubenswrapper[4749]: E0320 07:14:46.177180 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 07:14:46 crc kubenswrapper[4749]: E0320 07:14:46.177512 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678"
Mar 20 07:14:46 crc kubenswrapper[4749]: E0320 07:14:46.177795 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 07:14:47 crc kubenswrapper[4749]: I0320 07:14:47.177664 4749 scope.go:117] "RemoveContainer" containerID="4ea02e47ca928e345d9158ed3cad49551f45914a2501a428a0d1ad63e4bf933e"
Mar 20 07:14:47 crc kubenswrapper[4749]: E0320 07:14:47.178056 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-tdgcw_openshift-ovn-kubernetes(2153d97b-a108-49f8-b6c8-8223ea65b878)\"" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878"
Mar 20 07:14:48 crc kubenswrapper[4749]: I0320 07:14:48.177349 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 07:14:48 crc kubenswrapper[4749]: I0320 07:14:48.177394 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 07:14:48 crc kubenswrapper[4749]: I0320 07:14:48.177420 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 07:14:48 crc kubenswrapper[4749]: E0320 07:14:48.177516 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 07:14:48 crc kubenswrapper[4749]: I0320 07:14:48.177592 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh"
Mar 20 07:14:48 crc kubenswrapper[4749]: E0320 07:14:48.177679 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 07:14:48 crc kubenswrapper[4749]: E0320 07:14:48.178108 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678"
Mar 20 07:14:48 crc kubenswrapper[4749]: E0320 07:14:48.178464 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 07:14:49 crc kubenswrapper[4749]: E0320 07:14:49.305408 4749 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 20 07:14:50 crc kubenswrapper[4749]: I0320 07:14:50.176433 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 07:14:50 crc kubenswrapper[4749]: I0320 07:14:50.176478 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 07:14:50 crc kubenswrapper[4749]: I0320 07:14:50.176514 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 07:14:50 crc kubenswrapper[4749]: E0320 07:14:50.176636 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 07:14:50 crc kubenswrapper[4749]: I0320 07:14:50.176668 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh"
Mar 20 07:14:50 crc kubenswrapper[4749]: E0320 07:14:50.176806 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 07:14:50 crc kubenswrapper[4749]: E0320 07:14:50.176952 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678"
Mar 20 07:14:50 crc kubenswrapper[4749]: E0320 07:14:50.177074 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 07:14:51 crc kubenswrapper[4749]: I0320 07:14:51.588664 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rcq9v_3f813da7-84d4-4550-ad66-f282814444a3/kube-multus/0.log" Mar 20 07:14:51 crc kubenswrapper[4749]: I0320 07:14:51.588715 4749 generic.go:334] "Generic (PLEG): container finished" podID="3f813da7-84d4-4550-ad66-f282814444a3" containerID="f01cb7c52842132ac657c21cb0cac4167a7b0c07ac20803552a8290a0d19e008" exitCode=1 Mar 20 07:14:51 crc kubenswrapper[4749]: I0320 07:14:51.588743 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rcq9v" event={"ID":"3f813da7-84d4-4550-ad66-f282814444a3","Type":"ContainerDied","Data":"f01cb7c52842132ac657c21cb0cac4167a7b0c07ac20803552a8290a0d19e008"} Mar 20 07:14:51 crc kubenswrapper[4749]: I0320 07:14:51.589146 4749 scope.go:117] "RemoveContainer" containerID="f01cb7c52842132ac657c21cb0cac4167a7b0c07ac20803552a8290a0d19e008" Mar 20 07:14:51 crc kubenswrapper[4749]: I0320 07:14:51.607341 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g4qlg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19bf4391-88b7-43a0-9b6a-435261a44ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19e11d475744ecbce4f3285124657c66590dc339fe1af7d863b19d129ca09bbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388
416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://263bd63d005d0bec35609f642694605cd9fea786c2842c857d7f5efc7a6b6af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263bd63d005d0bec35609f642694605cd9fea786c2842c857d7f5efc7a6b6af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e9ed8774358234497eacb680940c422ab5086a1baa09efae600fd7e0080c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e9ed8774358234497eacb680940c422ab5086a1baa09efae600fd7e0080c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://812f9fb3af64c0bcffc23b7bef225d20328fe3348d910b174c99f2330ef75bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://812f9fb3af64c0bcffc23b7bef225d20328fe3348d910b174c99f2330ef75bcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:10Z\\\"}},\\\"volumeMounts\\\":[{\
\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g4qlg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:51Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:51 crc kubenswrapper[4749]: I0320 07:14:51.622522 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnwpn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdf0a692-3cf9-4abe-8b52-c81a040c0e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e24c08ed2a41f1d8a54c1c9edf5511f5ef6016bbdfa19cf6c40e8a639e1e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgjwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnwpn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:51Z is after 2025-08-24T17:21:41Z" 
Mar 20 07:14:51 crc kubenswrapper[4749]: I0320 07:14:51.635091 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k56zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d19b89e-d048-4656-b5ce-c637190ab678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k56zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:51Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:51 crc kubenswrapper[4749]: I0320 07:14:51.650542 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d36aabe4-f4b7-4552-848b-0c22f7ac4753\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a9d3d56425dd88c89608d446f6d44c5f90644cea243dd023e74c5630a0a99e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e71bf5e132166e8d3e2f33eb325502e54ff36380220a07917135b27ebe41c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b332a4612c6855c57c6c15a305a1f56099dab01f849027ea2eeda56718010cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://688e8fa067ea553fac09be724c46f16706c8b3463f09d6a4e2cfe3212027da17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T07:13:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 07:13:46.902726 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 07:13:46.902897 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 07:13:46.903679 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-172154110/tls.crt::/tmp/serving-cert-172154110/tls.key\\\\\\\"\\\\nI0320 07:13:47.353972 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 07:13:47.356217 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 07:13:47.356236 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 07:13:47.356252 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 07:13:47.356257 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 07:13:47.360047 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 07:13:47.360067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 07:13:47.360094 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360107 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360118 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 07:13:47.360127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 07:13:47.360134 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 07:13:47.360142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 07:13:47.361128 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:13:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb6e64ecd020e07bd8f22e52fcf960c975a09da0f06a9f43daf5bfbff01de3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:51Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:51 crc kubenswrapper[4749]: I0320 07:14:51.664340 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:51Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:51 crc kubenswrapper[4749]: I0320 07:14:51.680229 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c28e09a3eab6484907f72f0c4e3809f5a04d1b344fba717e6639f97c544acf83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:51Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:51 crc kubenswrapper[4749]: I0320 07:14:51.694192 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:51Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:51 crc kubenswrapper[4749]: I0320 07:14:51.707023 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12151228-1cb9-4086-9a62-f4a9583f5f69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://727db5182d25f135cceb40ce56d93c74fd6ff79a08e042fded129a1b8c96eb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e97608b8dbd15f9f6a4df363aa16c0f7e4a3d501a4182627876064290b63e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxqfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:51Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:51 crc kubenswrapper[4749]: I0320 07:14:51.722214 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:51Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:51 crc kubenswrapper[4749]: I0320 07:14:51.735692 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf9225edfc659321c44243f73dd56d4661a1d16c7c2a53b7ef69768d6b88f8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:51Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:51 crc kubenswrapper[4749]: I0320 07:14:51.750864 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r9vtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5fc763-08fb-4b02-a3cd-6f85310f0e14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38353c2a5737bf0e7e3552efcf7c55c31fded95481f21b06f9d364a944dbeebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x656g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r9vtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:51Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:51 crc kubenswrapper[4749]: I0320 07:14:51.769275 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rcq9v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f813da7-84d4-4550-ad66-f282814444a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01cb7c52842132ac657c21cb0cac4167a7b0c07ac20803552a8290a0d19e008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f01cb7c52842132ac657c21cb0cac4167a7b0c07ac20803552a8290a0d19e008\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T07:14:51Z\\\",\\\"message\\\":\\\"2026-03-20T07:14:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_02827f57-c4fa-48b0-ac23-f34108fdb778\\\\n2026-03-20T07:14:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_02827f57-c4fa-48b0-ac23-f34108fdb778 to /host/opt/cni/bin/\\\\n2026-03-20T07:14:06Z [verbose] multus-daemon started\\\\n2026-03-20T07:14:06Z [verbose] Readiness Indicator file check\\\\n2026-03-20T07:14:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the 
condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkbh9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rcq9v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:51Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:51 crc kubenswrapper[4749]: I0320 07:14:51.773715 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 07:14:51 crc kubenswrapper[4749]: I0320 07:14:51.800377 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2153d97b-a108-49f8-b6c8-8223ea65b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4f35833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea02e47ca928e345d9158ed3cad49551f45914a2501a428a0d1ad63e4bf933e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ea02e47ca928e345d9158ed3cad49551f45914a2501a428a0d1ad63e4bf933e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T07:14:32Z\\\",\\\"message\\\":\\\"od event handler 3\\\\nI0320 07:14:32.081966 6927 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 07:14:32.082007 6927 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 07:14:32.081939 6927 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 07:14:32.082053 6927 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 07:14:32.082092 6927 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 07:14:32.082103 6927 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 07:14:32.082153 6927 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 07:14:32.082169 6927 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 07:14:32.082188 6927 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 07:14:32.082205 6927 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 07:14:32.082223 6927 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 07:14:32.082225 6927 factory.go:656] Stopping watch factory\\\\nI0320 07:14:32.082255 6927 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 07:14:32.082358 6927 ovnkube.go:599] Stopped ovnkube\\\\nI0320 07:14:32.082420 6927 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 07:14:32.082513 6927 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tdgcw_openshift-ovn-kubernetes(2153d97b-a108-49f8-b6c8-8223ea65b878)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tdgcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:51Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:51 crc kubenswrapper[4749]: I0320 07:14:51.835429 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ade5670f-28bc-4c68-b28c-cec1ee830afe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f986500b6f9e4ab2cf3366a7e05e9274f9192bdc576e52c82f8dafc9f1ce37c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0549d8b1f6c2132ed8356a
c6c67078f9431cf7a9b057922e0ba5e2eb9f7f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f97471d68760ce0f43e5c1c0bafa7c6b429812dd58e2b2fa2eabd378a0789d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://478f80cc5a3895ca8ae8adbaa46990b39941e2824b2bf1c93ca34bb7d15cbdd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a361154efd67448ce4f9008639d02864d9d3aa766b0937f3729b13a5d0b8948a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:51Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:51 crc kubenswrapper[4749]: I0320 07:14:51.855785 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a9a22f5-482c-4da4-b2c6-cb1b9dad05e1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93eaff3eb0b1240b3d19fdd70f9a27c0543e794b7bb61e3e1886807a8a712758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://189ab91bc96e9893f362ea6fae4ae81880b230b84a3987760796360150187043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9041a702c52186ffb23b29e5a5ddeddefef6f576571a20f1f43027ff3225641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830983de883dd8f7cd7c3da3c23b2d33e795b3c75222381378c17d43f8fb435f\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://830983de883dd8f7cd7c3da3c23b2d33e795b3c75222381378c17d43f8fb435f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:51Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:51 crc kubenswrapper[4749]: I0320 07:14:51.871611 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1384b577-c860-43e3-927f-3aa6d9eaadbc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3db71fc201b999f26a4841d7cff88cd6c415d1a2ad4920d354ed394ac8ad2982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ef347d178defb70362fc7330bec72e266f77e6bd46c7ce4cf0c7018d585171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ef347d178defb70362fc7330bec72e266f77e6bd46c7ce4cf0c7018d585171\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:51Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:51 crc kubenswrapper[4749]: I0320 07:14:51.891407 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://114c64c832173caefe8b9d0030fd0ac53be4c97636f0d1735ad2b5149e38ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78309ee345e93c6d9fb93f2f6cd3b3b80b2a7feec2b0fbca5962e00d978c66c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T
07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:51Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:51 crc kubenswrapper[4749]: I0320 07:14:51.906682 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68xpr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://080ba314933e5d85aef3e133f4372bc6e5881f2b3cf3ce1e769927e6798f328e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c96bd69f6d76b0604262b3105aafd077a3b603667218b7e6b81a5fcb0b49b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\
\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68xpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:51Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:51 crc kubenswrapper[4749]: I0320 07:14:51.920883 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnwpn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdf0a692-3cf9-4abe-8b52-c81a040c0e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e24c08ed2a41f1d8a54c1c9edf5511f5ef6016bbdfa19cf6c40e8a639e1e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgjwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnwpn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:51Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:51 crc kubenswrapper[4749]: I0320 07:14:51.936354 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k56zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d19b89e-d048-4656-b5ce-c637190ab678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k56zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:51Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:51 crc kubenswrapper[4749]: I0320 07:14:51.956579 4749 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d36aabe4-f4b7-4552-848b-0c22f7ac4753\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a9d3d56425dd88c89608d446f6d44c5f90644cea243dd023e74c5630a0a99e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e71bf5e132166e8d3e2f33eb325502e54ff36380220a07917135b27ebe41c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b332a4612c6855c57c6c15a305a1f56099dab01f849027ea2eeda56718010cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://688e8fa067ea553fac09be724c46f167
06c8b3463f09d6a4e2cfe3212027da17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T07:13:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 07:13:46.902726 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 07:13:46.902897 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 07:13:46.903679 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-172154110/tls.crt::/tmp/serving-cert-172154110/tls.key\\\\\\\"\\\\nI0320 07:13:47.353972 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 07:13:47.356217 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 07:13:47.356236 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 07:13:47.356252 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 07:13:47.356257 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 07:13:47.360047 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 07:13:47.360067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 07:13:47.360094 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360107 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360118 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 07:13:47.360127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 07:13:47.360134 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 07:13:47.360142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 07:13:47.361128 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:13:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb6e64ecd020e07bd8f22e52fcf960c975a09da0f06a9f43daf5bfbff01de3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:51Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:51 crc kubenswrapper[4749]: I0320 07:14:51.976488 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:51Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:51 crc kubenswrapper[4749]: I0320 07:14:51.998408 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g4qlg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19bf4391-88b7-43a0-9b6a-435261a44ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19e11d475744ecbce4f3285124657c66590dc339fe1af7d863b19d129ca09bbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://263bd63d005d0bec35609f642694605cd9fea786c2842c857d7f5efc7a6b6af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263bd63d005d0bec35609f642694605cd9fea786c2842c857d7f5efc7a6b6af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e9ed8774358234497eacb680940c422ab5086a1baa09efae600fd7e0080c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e9ed8774358234497eacb680940c422ab5086a1baa09efae600fd7e0080c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://812f9fb3af64c0bcffc23b7bef225d20328fe3348d910b174c99f2330ef75bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://812f9fb3af64c0bcffc23b7bef225d20328fe3348d910b174c99f2330ef75bcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g4qlg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:51Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:52 crc kubenswrapper[4749]: I0320 07:14:52.018388 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c28e09a3eab6484907f72f0c4e3809f5a04d1b344fba717e6639f97c544acf83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:52Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:52 crc kubenswrapper[4749]: I0320 07:14:52.035196 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:52Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:52 crc kubenswrapper[4749]: I0320 07:14:52.049581 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12151228-1cb9-4086-9a62-f4a9583f5f69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://727db5182d25f135cceb40ce56d93c74fd6ff79a08e042fded129a1b8c96eb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e97608b8dbd15f9f6a4df363aa16c0f7e4a3d501a4182627876064290b63e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxqfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:52Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:52 crc kubenswrapper[4749]: I0320 07:14:52.068891 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf9225edfc659321c44243f73dd56d4661a1d16c7c2a53b7ef69768d6b88f8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:52Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:52 crc kubenswrapper[4749]: I0320 07:14:52.086517 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r9vtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5fc763-08fb-4b02-a3cd-6f85310f0e14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38353c2a5737bf0e7e3552efcf7c55c31fded95481f21b06f9d364a944dbeebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x656g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r9vtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:52Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:52 crc kubenswrapper[4749]: I0320 07:14:52.105757 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rcq9v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f813da7-84d4-4550-ad66-f282814444a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01cb7c52842132ac657c21cb0cac4167a7b0c07ac20803552a8290a0d19e008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f01cb7c52842132ac657c21cb0cac4167a7b0c07ac20803552a8290a0d19e008\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T07:14:51Z\\\",\\\"message\\\":\\\"2026-03-20T07:14:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_02827f57-c4fa-48b0-ac23-f34108fdb778\\\\n2026-03-20T07:14:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_02827f57-c4fa-48b0-ac23-f34108fdb778 to /host/opt/cni/bin/\\\\n2026-03-20T07:14:06Z [verbose] multus-daemon started\\\\n2026-03-20T07:14:06Z [verbose] Readiness Indicator file check\\\\n2026-03-20T07:14:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the 
condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkbh9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rcq9v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:52Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:52 crc kubenswrapper[4749]: I0320 07:14:52.126266 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2153d97b-a108-49f8-b6c8-8223ea65b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4f35833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea02e47ca928e345d9158ed3cad49551f45914a2501a428a0d1ad63e4bf933e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ea02e47ca928e345d9158ed3cad49551f45914a2501a428a0d1ad63e4bf933e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T07:14:32Z\\\",\\\"message\\\":\\\"od event handler 3\\\\nI0320 07:14:32.081966 6927 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 07:14:32.082007 6927 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 07:14:32.081939 6927 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 07:14:32.082053 6927 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 07:14:32.082092 6927 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 07:14:32.082103 6927 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 07:14:32.082153 6927 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 07:14:32.082169 6927 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 07:14:32.082188 6927 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 07:14:32.082205 6927 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 07:14:32.082223 6927 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 07:14:32.082225 6927 factory.go:656] Stopping watch factory\\\\nI0320 07:14:32.082255 6927 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 07:14:32.082358 6927 ovnkube.go:599] Stopped ovnkube\\\\nI0320 07:14:32.082420 6927 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 07:14:32.082513 6927 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tdgcw_openshift-ovn-kubernetes(2153d97b-a108-49f8-b6c8-8223ea65b878)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tdgcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:52Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:52 crc kubenswrapper[4749]: I0320 07:14:52.158606 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ade5670f-28bc-4c68-b28c-cec1ee830afe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f986500b6f9e4ab2cf3366a7e05e9274f9192bdc576e52c82f8dafc9f1ce37c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0549d8b1f6c2132ed8356a
c6c67078f9431cf7a9b057922e0ba5e2eb9f7f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f97471d68760ce0f43e5c1c0bafa7c6b429812dd58e2b2fa2eabd378a0789d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://478f80cc5a3895ca8ae8adbaa46990b39941e2824b2bf1c93ca34bb7d15cbdd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a361154efd67448ce4f9008639d02864d9d3aa766b0937f3729b13a5d0b8948a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:52Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:52 crc kubenswrapper[4749]: I0320 07:14:52.171715 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a9a22f5-482c-4da4-b2c6-cb1b9dad05e1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93eaff3eb0b1240b3d19fdd70f9a27c0543e794b7bb61e3e1886807a8a712758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://189ab91bc96e9893f362ea6fae4ae81880b230b84a3987760796360150187043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9041a702c52186ffb23b29e5a5ddeddefef6f576571a20f1f43027ff3225641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830983de883dd8f7cd7c3da3c23b2d33e795b3c75222381378c17d43f8fb435f\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://830983de883dd8f7cd7c3da3c23b2d33e795b3c75222381378c17d43f8fb435f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:52Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:52 crc kubenswrapper[4749]: I0320 07:14:52.176409 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:14:52 crc kubenswrapper[4749]: I0320 07:14:52.176488 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:14:52 crc kubenswrapper[4749]: I0320 07:14:52.176545 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:14:52 crc kubenswrapper[4749]: I0320 07:14:52.176625 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:14:52 crc kubenswrapper[4749]: E0320 07:14:52.176674 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 07:14:52 crc kubenswrapper[4749]: E0320 07:14:52.176505 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 07:14:52 crc kubenswrapper[4749]: E0320 07:14:52.176745 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 07:14:52 crc kubenswrapper[4749]: E0320 07:14:52.176903 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678" Mar 20 07:14:52 crc kubenswrapper[4749]: I0320 07:14:52.186027 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1384b577-c860-43e3-927f-3aa6d9eaadbc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3db71fc201b999f26a4841d7cff88cd6c415d1a2ad4920d354ed394ac8ad2982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ef347d178defb70362fc7330bec72e266f77e6bd46c7ce4cf0c7018d585171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ef347d178defb70362fc7330bec72e266f77e6bd46c7ce4cf0c7018d585171\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\
\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:52Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:52 crc kubenswrapper[4749]: I0320 07:14:52.199928 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:52Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:52 crc kubenswrapper[4749]: I0320 07:14:52.213326 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://114c64c832173caefe8b9d0030fd0ac53be4c97636f0d1735ad2b5149e38ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78309ee345e93c6d9fb93f2f6cd3b3b80b2a7feec2b0fbca5962e00d978c66c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:52Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:52 crc kubenswrapper[4749]: I0320 07:14:52.226263 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68xpr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://080ba314933e5d85aef3e133f4372bc6e5881f2b3cf3ce1e769927e6798f328e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c96bd69f6d76b0604262b3105aafd077a3b603667218b7e6b81a5fcb0b49b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\
\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68xpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:52Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:52 crc kubenswrapper[4749]: I0320 07:14:52.594930 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rcq9v_3f813da7-84d4-4550-ad66-f282814444a3/kube-multus/0.log" Mar 20 07:14:52 crc kubenswrapper[4749]: I0320 07:14:52.595988 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rcq9v" event={"ID":"3f813da7-84d4-4550-ad66-f282814444a3","Type":"ContainerStarted","Data":"290c8178fc52bf0ce040051ac3f6e31f5f5245203c3a61c98c6a723710fbb94b"} Mar 20 07:14:52 crc kubenswrapper[4749]: I0320 07:14:52.618633 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:52Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:52 crc kubenswrapper[4749]: I0320 07:14:52.636917 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf9225edfc659321c44243f73dd56d4661a1d16c7c2a53b7ef69768d6b88f8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:52Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:52 crc kubenswrapper[4749]: I0320 07:14:52.652524 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r9vtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5fc763-08fb-4b02-a3cd-6f85310f0e14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38353c2a5737bf0e7e3552efcf7c55c31fded95481f21b06f9d364a944dbeebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x656g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r9vtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:52Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:52 crc kubenswrapper[4749]: I0320 07:14:52.670350 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rcq9v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f813da7-84d4-4550-ad66-f282814444a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://290c8178fc52bf0ce040051ac3f6e31f5f5245203c3a61c98c6a723710fbb94b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f01cb7c52842132ac657c21cb0cac4167a7b0c07ac20803552a8290a0d19e008\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T07:14:51Z\\\",\\\"message\\\":\\\"2026-03-20T07:14:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_02827f57-c4fa-48b0-ac23-f34108fdb778\\\\n2026-03-20T07:14:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_02827f57-c4fa-48b0-ac23-f34108fdb778 to /host/opt/cni/bin/\\\\n2026-03-20T07:14:06Z [verbose] multus-daemon started\\\\n2026-03-20T07:14:06Z [verbose] Readiness Indicator file check\\\\n2026-03-20T07:14:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkbh9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rcq9v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:52Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:52 crc kubenswrapper[4749]: I0320 07:14:52.691099 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2153d97b-a108-49f8-b6c8-8223ea65b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4f35833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea02e47ca928e345d9158ed3cad49551f45914a2501a428a0d1ad63e4bf933e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ea02e47ca928e345d9158ed3cad49551f45914a2501a428a0d1ad63e4bf933e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T07:14:32Z\\\",\\\"message\\\":\\\"od event handler 3\\\\nI0320 07:14:32.081966 6927 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 07:14:32.082007 6927 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 07:14:32.081939 6927 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 07:14:32.082053 6927 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 07:14:32.082092 6927 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 07:14:32.082103 6927 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 07:14:32.082153 6927 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 07:14:32.082169 6927 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 07:14:32.082188 6927 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 07:14:32.082205 6927 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 07:14:32.082223 6927 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 07:14:32.082225 6927 factory.go:656] Stopping watch factory\\\\nI0320 07:14:32.082255 6927 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 07:14:32.082358 6927 ovnkube.go:599] Stopped ovnkube\\\\nI0320 07:14:32.082420 6927 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 07:14:32.082513 6927 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tdgcw_openshift-ovn-kubernetes(2153d97b-a108-49f8-b6c8-8223ea65b878)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tdgcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:52Z is after 2025-08-24T17:21:41Z"
Mar 20 07:14:52 crc kubenswrapper[4749]: I0320 07:14:52.725016 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ade5670f-28bc-4c68-b28c-cec1ee830afe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f986500b6f9e4ab2cf3366a7e05e9274f9192bdc576e52c82f8dafc9f1ce37c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0549d8b1f6c2132ed8356ac6c67078f9431cf7a9b057922e0ba5e2eb9f7f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f97471d68760ce0f43e5c1c0bafa7c6b429812dd58e2b2fa2eabd378a0789d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://478f80cc5a3895ca8ae8adbaa46990b39941e2824b2bf1c93ca34bb7d15cbdd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a361154efd67448ce4f9008639d02864d9d3aa766b0937f3729b13a5d0b8948a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:52Z is after 2025-08-24T17:21:41Z"
Mar 20 07:14:52 crc kubenswrapper[4749]: I0320 07:14:52.741338 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a9a22f5-482c-4da4-b2c6-cb1b9dad05e1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93eaff3eb0b1240b3d19fdd70f9a27c0543e794b7bb61e3e1886807a8a712758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://189ab91bc96e9893f362ea6fae4ae81880b230b84a3987760796360150187043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9041a702c52186ffb23b29e5a5ddeddefef6f576571a20f1f43027ff3225641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830983de883dd8f7cd7c3da3c23b2d33e795b3c75222381378c17d43f8fb435f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://830983de883dd8f7cd7c3da3c23b2d33e795b3c75222381378c17d43f8fb435f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:52Z is after 2025-08-24T17:21:41Z"
Mar 20 07:14:52 crc kubenswrapper[4749]: I0320 07:14:52.758386 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1384b577-c860-43e3-927f-3aa6d9eaadbc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3db71fc201b999f26a4841d7cff88cd6c415d1a2ad4920d354ed394ac8ad2982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ef347d178defb70362fc7330bec72e266f77e6bd46c7ce4cf0c7018d585171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ef347d178defb70362fc7330bec72e266f77e6bd46c7ce4cf0c7018d585171\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:52Z is after 2025-08-24T17:21:41Z"
Mar 20 07:14:52 crc kubenswrapper[4749]: I0320 07:14:52.777783 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://114c64c832173caefe8b9d0030fd0ac53be4c97636f0d1735ad2b5149e38ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78309ee345e93c6d9fb93f2f6cd3b3b80b2a7feec2b0fbca5962e00d978c66c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:52Z is after 2025-08-24T17:21:41Z"
Mar 20 07:14:52 crc kubenswrapper[4749]: I0320 07:14:52.792377 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68xpr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://080ba314933e5d85aef3e133f4372bc6e5881f2b3cf3ce1e769927e6798f328e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c96bd69f6d76b0604262b3105aafd077a3b603667218b7e6b81a5fcb0b49b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68xpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:52Z is after 2025-08-24T17:21:41Z"
Mar 20 07:14:52 crc kubenswrapper[4749]: I0320 07:14:52.811427 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g4qlg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19bf4391-88b7-43a0-9b6a-435261a44ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19e11d475744ecbce4f3285124657c66590dc339fe1af7d863b19d129ca09bbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://263bd63d005d0bec35609f642694605cd9fea786c2842c857d7f5efc7a6b6af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263bd63d005d0bec35609f642694605cd9fea786c2842c857d7f5efc7a6b6af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e9ed8774358234497eacb680940c422ab5086a1baa09efae600fd7e0080c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e9ed8774358234497eacb680940c422ab5086a1baa09efae600fd7e0080c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://812f9fb3af64c0bcffc23b7bef225d20328fe3348d910b174c99f2330ef75bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://812f9fb3af64c0bcffc23b7bef225d20328fe3348d910b174c99f2330ef75bcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g4qlg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:52Z is after 2025-08-24T17:21:41Z"
Mar 20 07:14:52 crc kubenswrapper[4749]: I0320 07:14:52.825192 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnwpn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdf0a692-3cf9-4abe-8b52-c81a040c0e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e24c08ed2a41f1d8a54c1c9edf5511f5ef6016bbdfa19cf6c40e8a639e1e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgjwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnwpn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:52Z is after 2025-08-24T17:21:41Z"
Mar 20 07:14:52 crc kubenswrapper[4749]: I0320 07:14:52.840848 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k56zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d19b89e-d048-4656-b5ce-c637190ab678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k56zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:52Z is after 2025-08-24T17:21:41Z"
Mar 20 07:14:52 crc kubenswrapper[4749]: I0320 07:14:52.861413 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d36aabe4-f4b7-4552-848b-0c22f7ac4753\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a9d3d56425dd88c89608d446f6d44c5f90644cea243dd023e74c5630a0a99e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e71bf5e132166e8d3e2f33eb325502e54ff36380220a07917135b27ebe41c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b332a4612c6855c57c6c15a305a1f56099dab01f849027ea2eeda56718010cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://688e8fa067ea553fac09be724c46f16706c8b3463f09d6a4e2cfe3212027da17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T07:13:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 07:13:46.902726 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 07:13:46.902897 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 07:13:46.903679 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-172154110/tls.crt::/tmp/serving-cert-172154110/tls.key\\\\\\\"\\\\nI0320 07:13:47.353972 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 07:13:47.356217 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 07:13:47.356236 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 07:13:47.356252 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 07:13:47.356257 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 07:13:47.360047 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 07:13:47.360067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 07:13:47.360094 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360107 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360118 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 07:13:47.360127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 07:13:47.360134 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 07:13:47.360142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 07:13:47.361128 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:13:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb6e64ecd020e07bd8f22e52fcf960c975a09da0f06a9f43daf5bfbff01de3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:52Z is after 2025-08-24T17:21:41Z"
Mar 20 07:14:52 crc kubenswrapper[4749]: I0320 07:14:52.873750 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:52Z is after 2025-08-24T17:21:41Z"
Mar 20 07:14:52 crc kubenswrapper[4749]: I0320 07:14:52.889217 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c28e09a3eab6484907f72f0c4e3809f5a04d1b344fba717e6639f97c544acf83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:52Z is after 2025-08-24T17:21:41Z"
Mar 20 07:14:52 crc kubenswrapper[4749]: I0320 07:14:52.902686 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:52Z is after 2025-08-24T17:21:41Z"
Mar 20 07:14:52 crc kubenswrapper[4749]: I0320 07:14:52.919417 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12151228-1cb9-4086-9a62-f4a9583f5f69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://727db5182d25f135cceb40ce56d93c74fd6ff79a08e042fded129a1b8c96eb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e97608b8dbd15f9f6a4df363aa16c0f7e4a3d501a4182627876064290b63e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxqfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:52Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:54 crc kubenswrapper[4749]: I0320 07:14:54.176574 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:14:54 crc kubenswrapper[4749]: I0320 07:14:54.176704 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:14:54 crc kubenswrapper[4749]: I0320 07:14:54.176775 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:14:54 crc kubenswrapper[4749]: E0320 07:14:54.176838 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 07:14:54 crc kubenswrapper[4749]: E0320 07:14:54.176962 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678" Mar 20 07:14:54 crc kubenswrapper[4749]: I0320 07:14:54.177062 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:14:54 crc kubenswrapper[4749]: E0320 07:14:54.177119 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 07:14:54 crc kubenswrapper[4749]: E0320 07:14:54.177275 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 07:14:54 crc kubenswrapper[4749]: I0320 07:14:54.193791 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnwpn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdf0a692-3cf9-4abe-8b52-c81a040c0e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e24c08ed2a41f1d8a54c1c9edf5511f5ef6016bbdfa19cf6c40e8a639e1e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgjwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnwpn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:54Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:54 crc kubenswrapper[4749]: I0320 07:14:54.208821 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k56zh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d19b89e-d048-4656-b5ce-c637190ab678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k56zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:54Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:54 crc kubenswrapper[4749]: I0320 07:14:54.222799 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d36aabe4-f4b7-4552-848b-0c22f7ac4753\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a9d3d56425dd88c89608d446f6d44c5f90644cea243dd023e74c5630a0a99e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e71bf5e132166e8d3e2f33eb325502e54ff36380220a07917135b27ebe41c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b332a4612c6855c57c6c15a305a1f56099dab01f849027ea2eeda56718010cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://688e8fa067ea553fac09be724c46f16706c8b3463f09d6a4e2cfe3212027da17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T07:13:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 07:13:46.902726 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 07:13:46.902897 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 07:13:46.903679 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-172154110/tls.crt::/tmp/serving-cert-172154110/tls.key\\\\\\\"\\\\nI0320 07:13:47.353972 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 07:13:47.356217 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 07:13:47.356236 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 07:13:47.356252 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 07:13:47.356257 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 07:13:47.360047 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 07:13:47.360067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 07:13:47.360094 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360107 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360118 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 07:13:47.360127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 07:13:47.360134 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 07:13:47.360142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 07:13:47.361128 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:13:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb6e64ecd020e07bd8f22e52fcf960c975a09da0f06a9f43daf5bfbff01de3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:54Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:54 crc kubenswrapper[4749]: I0320 07:14:54.236079 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:54Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:54 crc kubenswrapper[4749]: I0320 07:14:54.259165 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g4qlg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19bf4391-88b7-43a0-9b6a-435261a44ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19e11d475744ecbce4f3285124657c66590dc339fe1af7d863b19d129ca09bbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://263bd63d005d0bec35609f642694605cd9fea786c2842c857d7f5efc7a6b6af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263bd63d005d0bec35609f642694605cd9fea786c2842c857d7f5efc7a6b6af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e9ed8774358234497eacb680940c422ab5086a1baa09efae600fd7e0080c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e9ed8774358234497eacb680940c422ab5086a1baa09efae600fd7e0080c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://812f9fb3af64c0bcffc23b7bef225d20328fe3348d910b174c99f2330ef75bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://812f9fb3af64c0bcffc23b7bef225d20328fe3348d910b174c99f2330ef75bcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g4qlg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:54Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:54 crc kubenswrapper[4749]: I0320 07:14:54.276624 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c28e09a3eab6484907f72f0c4e3809f5a04d1b344fba717e6639f97c544acf83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:54Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:54 crc kubenswrapper[4749]: I0320 07:14:54.293438 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:54Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:54 crc kubenswrapper[4749]: E0320 07:14:54.306014 4749 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 07:14:54 crc kubenswrapper[4749]: I0320 07:14:54.312257 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12151228-1cb9-4086-9a62-f4a9583f5f69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://727db5182d25f135cceb40ce56d93c74fd6ff79a08e042fded129a1b8c96eb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e97608b8dbd15f9f6a4df363aa16c0f7e4a3d501a4182627876064290b63e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxqfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:54Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:54 crc kubenswrapper[4749]: I0320 07:14:54.329424 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf9225edfc659321c44243f73dd56d4661a1d16c7c2a53b7ef69768d6b88f8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:54Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:54 crc kubenswrapper[4749]: I0320 07:14:54.342967 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r9vtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5fc763-08fb-4b02-a3cd-6f85310f0e14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38353c2a5737bf0e7e3552efcf7c55c31fded95481f21b06f9d364a944dbeebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x656g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r9vtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:54Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:54 crc kubenswrapper[4749]: I0320 07:14:54.361528 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rcq9v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f813da7-84d4-4550-ad66-f282814444a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://290c8178fc52bf0ce040051ac3f6e31f5f5245203c3a61c98c6a723710fbb94b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f01cb7c52842132ac657c21cb0cac4167a7b0c07ac20803552a8290a0d19e008\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T07:14:51Z\\\",\\\"message\\\":\\\"2026-03-20T07:14:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_02827f57-c4fa-48b0-ac23-f34108fdb778\\\\n2026-03-20T07:14:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_02827f57-c4fa-48b0-ac23-f34108fdb778 to /host/opt/cni/bin/\\\\n2026-03-20T07:14:06Z [verbose] multus-daemon started\\\\n2026-03-20T07:14:06Z [verbose] Readiness Indicator file check\\\\n2026-03-20T07:14:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkbh9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rcq9v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:54Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:54 crc kubenswrapper[4749]: I0320 07:14:54.388343 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2153d97b-a108-49f8-b6c8-8223ea65b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4f35833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea02e47ca928e345d9158ed3cad49551f45914a2501a428a0d1ad63e4bf933e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ea02e47ca928e345d9158ed3cad49551f45914a2501a428a0d1ad63e4bf933e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T07:14:32Z\\\",\\\"message\\\":\\\"od event handler 3\\\\nI0320 07:14:32.081966 6927 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 07:14:32.082007 6927 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 07:14:32.081939 6927 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 07:14:32.082053 6927 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 07:14:32.082092 6927 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 07:14:32.082103 6927 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 07:14:32.082153 6927 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 07:14:32.082169 6927 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 07:14:32.082188 6927 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 07:14:32.082205 6927 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 07:14:32.082223 6927 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 07:14:32.082225 6927 factory.go:656] Stopping watch factory\\\\nI0320 07:14:32.082255 6927 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 07:14:32.082358 6927 ovnkube.go:599] Stopped ovnkube\\\\nI0320 07:14:32.082420 6927 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 07:14:32.082513 6927 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tdgcw_openshift-ovn-kubernetes(2153d97b-a108-49f8-b6c8-8223ea65b878)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tdgcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:54Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:54 crc kubenswrapper[4749]: I0320 07:14:54.421727 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ade5670f-28bc-4c68-b28c-cec1ee830afe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f986500b6f9e4ab2cf3366a7e05e9274f9192bdc576e52c82f8dafc9f1ce37c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0549d8b1f6c2132ed8356a
c6c67078f9431cf7a9b057922e0ba5e2eb9f7f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f97471d68760ce0f43e5c1c0bafa7c6b429812dd58e2b2fa2eabd378a0789d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://478f80cc5a3895ca8ae8adbaa46990b39941e2824b2bf1c93ca34bb7d15cbdd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a361154efd67448ce4f9008639d02864d9d3aa766b0937f3729b13a5d0b8948a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:54Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:54 crc kubenswrapper[4749]: I0320 07:14:54.436762 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a9a22f5-482c-4da4-b2c6-cb1b9dad05e1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93eaff3eb0b1240b3d19fdd70f9a27c0543e794b7bb61e3e1886807a8a712758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://189ab91bc96e9893f362ea6fae4ae81880b230b84a3987760796360150187043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9041a702c52186ffb23b29e5a5ddeddefef6f576571a20f1f43027ff3225641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830983de883dd8f7cd7c3da3c23b2d33e795b3c75222381378c17d43f8fb435f\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://830983de883dd8f7cd7c3da3c23b2d33e795b3c75222381378c17d43f8fb435f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:54Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:54 crc kubenswrapper[4749]: I0320 07:14:54.452628 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1384b577-c860-43e3-927f-3aa6d9eaadbc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3db71fc201b999f26a4841d7cff88cd6c415d1a2ad4920d354ed394ac8ad2982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ef347d178defb70362fc7330bec72e266f77e6bd46c7ce4cf0c7018d585171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ef347d178defb70362fc7330bec72e266f77e6bd46c7ce4cf0c7018d585171\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:54Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:54 crc kubenswrapper[4749]: I0320 07:14:54.468336 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:54Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:54 crc kubenswrapper[4749]: I0320 07:14:54.486114 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://114c64c832173caefe8b9d0030fd0ac53be4c97636f0d1735ad2b5149e38ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78309ee345e93c6d9fb93f2f6cd3b3b80b2a7feec2b0fbca5962e00d978c66c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:54Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:54 crc kubenswrapper[4749]: I0320 07:14:54.501085 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68xpr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://080ba314933e5d85aef3e133f4372bc6e5881f2b3cf3ce1e769927e6798f328e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c96bd69f6d76b0604262b3105aafd077a3b603667218b7e6b81a5fcb0b49b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\
\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68xpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:54Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:55 crc kubenswrapper[4749]: I0320 07:14:55.751621 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:55 crc kubenswrapper[4749]: I0320 07:14:55.751681 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:14:55 crc kubenswrapper[4749]: I0320 07:14:55.751693 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:55 crc kubenswrapper[4749]: I0320 07:14:55.751708 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:55 crc kubenswrapper[4749]: I0320 07:14:55.751718 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:55Z","lastTransitionTime":"2026-03-20T07:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:14:55 crc kubenswrapper[4749]: E0320 07:14:55.771151 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cbc31b-af36-4be8-8e88-99f024097007\\\",\\\"systemUUID\\\":\\\"42f570dd-c9b2-4d24-870f-033a21aa11c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:55Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:55 crc kubenswrapper[4749]: I0320 07:14:55.776440 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:55 crc kubenswrapper[4749]: I0320 07:14:55.776500 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 07:14:55 crc kubenswrapper[4749]: I0320 07:14:55.776519 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:55 crc kubenswrapper[4749]: I0320 07:14:55.776549 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:55 crc kubenswrapper[4749]: I0320 07:14:55.776573 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:55Z","lastTransitionTime":"2026-03-20T07:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:55 crc kubenswrapper[4749]: E0320 07:14:55.798207 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cbc31b-af36-4be8-8e88-99f024097007\\\",\\\"systemUUID\\\":\\\"42f570dd-c9b2-4d24-870f-033a21aa11c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:55Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:55 crc kubenswrapper[4749]: I0320 07:14:55.803604 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:55 crc kubenswrapper[4749]: I0320 07:14:55.803662 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 07:14:55 crc kubenswrapper[4749]: I0320 07:14:55.803674 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:55 crc kubenswrapper[4749]: I0320 07:14:55.803694 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:55 crc kubenswrapper[4749]: I0320 07:14:55.803705 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:55Z","lastTransitionTime":"2026-03-20T07:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:55 crc kubenswrapper[4749]: E0320 07:14:55.819620 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cbc31b-af36-4be8-8e88-99f024097007\\\",\\\"systemUUID\\\":\\\"42f570dd-c9b2-4d24-870f-033a21aa11c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:55Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:55 crc kubenswrapper[4749]: I0320 07:14:55.824130 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:55 crc kubenswrapper[4749]: I0320 07:14:55.824181 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 07:14:55 crc kubenswrapper[4749]: I0320 07:14:55.824192 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:55 crc kubenswrapper[4749]: I0320 07:14:55.824208 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:55 crc kubenswrapper[4749]: I0320 07:14:55.824220 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:55Z","lastTransitionTime":"2026-03-20T07:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:55 crc kubenswrapper[4749]: E0320 07:14:55.838243 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cbc31b-af36-4be8-8e88-99f024097007\\\",\\\"systemUUID\\\":\\\"42f570dd-c9b2-4d24-870f-033a21aa11c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:55Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:55 crc kubenswrapper[4749]: I0320 07:14:55.841970 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:14:55 crc kubenswrapper[4749]: I0320 07:14:55.842015 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
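Every one of these patch attempts fails at the same point: the API server must call the validating webhook node.network-node-identity.openshift.io at https://127.0.0.1:9743, and the TLS handshake is rejected because the webhook's serving certificate expired on 2025-08-24T17:21:41Z while the node clock reads 2026-03-20. The "certificate has expired or is not yet valid" text is Go's standard x509 validity error. A minimal standalone sketch of that check follows; the webhook-cert.pem path is illustrative, not a path from this cluster:

package main

import (
    "crypto/x509"
    "encoding/pem"
    "fmt"
    "log"
    "os"
    "time"
)

func main() {
    // Illustrative path; point this at any PEM-encoded serving certificate.
    pemBytes, err := os.ReadFile("webhook-cert.pem")
    if err != nil {
        log.Fatal(err)
    }
    block, _ := pem.Decode(pemBytes)
    if block == nil || block.Type != "CERTIFICATE" {
        log.Fatal("no CERTIFICATE block found")
    }
    cert, err := x509.ParseCertificate(block.Bytes)
    if err != nil {
        log.Fatal(err)
    }
    now := time.Now().UTC()
    switch {
    case now.After(cert.NotAfter):
        // This is the state the kubelet log above is reporting.
        fmt.Printf("certificate has expired: current time %s is after %s\n",
            now.Format(time.RFC3339), cert.NotAfter.UTC().Format(time.RFC3339))
    case now.Before(cert.NotBefore):
        fmt.Printf("certificate is not yet valid: current time %s is before %s\n",
            now.Format(time.RFC3339), cert.NotBefore.UTC().Format(time.RFC3339))
    default:
        fmt.Printf("certificate is valid until %s\n", cert.NotAfter.UTC().Format(time.RFC3339))
    }
}

Running this against the webhook's serving certificate on the node would reproduce the same expired/not-yet-valid verdict the kubelet keeps hitting.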
event="NodeHasNoDiskPressure" Mar 20 07:14:55 crc kubenswrapper[4749]: I0320 07:14:55.842025 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:14:55 crc kubenswrapper[4749]: I0320 07:14:55.842042 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:14:55 crc kubenswrapper[4749]: I0320 07:14:55.842055 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:14:55Z","lastTransitionTime":"2026-03-20T07:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:14:55 crc kubenswrapper[4749]: E0320 07:14:55.856425 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:14:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cbc31b-af36-4be8-8e88-99f024097007\\\",\\\"systemUUID\\\":\\\"42f570dd-c9b2-4d24-870f-033a21aa11c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:14:55Z is after 2025-08-24T17:21:41Z" Mar 20 07:14:55 crc kubenswrapper[4749]: E0320 07:14:55.856571 4749 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 07:14:56 crc kubenswrapper[4749]: I0320 07:14:56.176519 4749 util.go:30] "No sandbox for pod can be found. 
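This is the kubelet's bounded retry around node-status updates: each failed PATCH is logged as "Error updating node status, will retry", and once the attempts are exhausted the sync gives up with "update node status exceeds retry count" and waits for the next status-update period. The upstream kubelet caps this at five consecutive attempts (the nodeStatusUpdateRetry constant). The sketch below mirrors the shape of that loop; tryPatchNodeStatus is a hypothetical stand-in for the real patch call, not the kubelet's actual helper:

package main

import (
    "errors"
    "fmt"
)

// nodeStatusUpdateRetry mirrors the upstream kubelet constant; the helper
// names below are illustrative only.
const nodeStatusUpdateRetry = 5

// tryPatchNodeStatus stands in for the PATCH that the expired webhook
// certificate keeps rejecting in the log above.
func tryPatchNodeStatus() error {
    return errors.New(`failed calling webhook "node.network-node-identity.openshift.io": certificate has expired`)
}

func updateNodeStatus() error {
    for i := 0; i < nodeStatusUpdateRetry; i++ {
        if err := tryPatchNodeStatus(); err != nil {
            fmt.Printf("Error updating node status, will retry: %v\n", err)
            continue
        }
        return nil
    }
    // Matches the final error string recorded above.
    return fmt.Errorf("update node status exceeds retry count")
}

func main() {
    if err := updateNodeStatus(); err != nil {
        fmt.Printf("Unable to update node status: %v\n", err)
    }
}

Because every attempt fails for the same non-transient reason (the expired certificate), the retry loop can never succeed here; it simply converts one root cause into five identical error records per sync period.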
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:14:56 crc kubenswrapper[4749]: I0320 07:14:56.176573 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:14:56 crc kubenswrapper[4749]: I0320 07:14:56.176577 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:14:56 crc kubenswrapper[4749]: E0320 07:14:56.176719 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 07:14:56 crc kubenswrapper[4749]: I0320 07:14:56.176745 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:14:56 crc kubenswrapper[4749]: E0320 07:14:56.176915 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678" Mar 20 07:14:56 crc kubenswrapper[4749]: E0320 07:14:56.177074 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 07:14:56 crc kubenswrapper[4749]: E0320 07:14:56.177226 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 07:14:58 crc kubenswrapper[4749]: I0320 07:14:58.176372 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:14:58 crc kubenswrapper[4749]: I0320 07:14:58.176933 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:14:58 crc kubenswrapper[4749]: E0320 07:14:58.177101 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 07:14:58 crc kubenswrapper[4749]: I0320 07:14:58.177359 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:14:58 crc kubenswrapper[4749]: I0320 07:14:58.177410 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:14:58 crc kubenswrapper[4749]: E0320 07:14:58.178004 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 07:14:58 crc kubenswrapper[4749]: E0320 07:14:58.178206 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678" Mar 20 07:14:58 crc kubenswrapper[4749]: E0320 07:14:58.178411 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 07:14:58 crc kubenswrapper[4749]: I0320 07:14:58.192854 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 20 07:14:59 crc kubenswrapper[4749]: E0320 07:14:59.308023 4749 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 07:15:00 crc kubenswrapper[4749]: I0320 07:15:00.176682 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:15:00 crc kubenswrapper[4749]: I0320 07:15:00.176708 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:15:00 crc kubenswrapper[4749]: I0320 07:15:00.176726 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:15:00 crc kubenswrapper[4749]: I0320 07:15:00.177028 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:15:00 crc kubenswrapper[4749]: E0320 07:15:00.177141 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 07:15:00 crc kubenswrapper[4749]: E0320 07:15:00.177318 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 07:15:00 crc kubenswrapper[4749]: E0320 07:15:00.177388 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 07:15:00 crc kubenswrapper[4749]: I0320 07:15:00.177434 4749 scope.go:117] "RemoveContainer" containerID="4ea02e47ca928e345d9158ed3cad49551f45914a2501a428a0d1ad63e4bf933e" Mar 20 07:15:00 crc kubenswrapper[4749]: E0320 07:15:00.177491 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678" Mar 20 07:15:00 crc kubenswrapper[4749]: I0320 07:15:00.623672 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tdgcw_2153d97b-a108-49f8-b6c8-8223ea65b878/ovnkube-controller/2.log" Mar 20 07:15:00 crc kubenswrapper[4749]: I0320 07:15:00.625897 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" event={"ID":"2153d97b-a108-49f8-b6c8-8223ea65b878","Type":"ContainerStarted","Data":"b7b48ca93159bd68739b8c9e9752d534320245059bc6d665239b5f34bf14bcc9"} Mar 20 07:15:00 crc kubenswrapper[4749]: I0320 07:15:00.626397 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:15:00 crc kubenswrapper[4749]: I0320 07:15:00.644576 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2153d97b-a108-49f8-b6c8-8223ea65b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4f35833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7b48ca93159bd68739b8c9e9752d53432024505
9bc6d665239b5f34bf14bcc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ea02e47ca928e345d9158ed3cad49551f45914a2501a428a0d1ad63e4bf933e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T07:14:32Z\\\",\\\"message\\\":\\\"od event handler 3\\\\nI0320 07:14:32.081966 6927 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 07:14:32.082007 6927 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 07:14:32.081939 6927 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 07:14:32.082053 6927 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 07:14:32.082092 6927 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 07:14:32.082103 6927 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 07:14:32.082153 6927 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 07:14:32.082169 6927 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 07:14:32.082188 6927 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 07:14:32.082205 6927 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 07:14:32.082223 6927 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 07:14:32.082225 6927 factory.go:656] Stopping watch factory\\\\nI0320 07:14:32.082255 6927 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 07:14:32.082358 6927 ovnkube.go:599] Stopped ovnkube\\\\nI0320 07:14:32.082420 6927 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 07:14:32.082513 6927 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tdgcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:00Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:00 crc kubenswrapper[4749]: I0320 07:15:00.671163 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ade5670f-28bc-4c68-b28c-cec1ee830afe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f986500b6f9e4ab2cf3366a7e05e9274f9192bdc576e52c82f8dafc9f1ce37c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0549d8b1f6c2132ed8356ac6c67078f9431cf7a9b057922e0ba5e2eb9f7f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f97471d68760ce0f43e5c1c0bafa7c6b429812dd58e2b2fa2eabd378a0789d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://478f80cc5a3895ca8ae8adbaa46990b39941e28
24b2bf1c93ca34bb7d15cbdd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a361154efd67448ce4f9008639d02864d9d3aa766b0937f3729b13a5d0b8948a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:00Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:00 crc kubenswrapper[4749]: I0320 07:15:00.682236 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a9a22f5-482c-4da4-b2c6-cb1b9dad05e1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93eaff3eb0b1240b3d19fdd70f9a27c0543e794b7bb61e3e1886807a8a712758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://189ab91bc96e9893f362ea6fae4ae81880b230b84a3987760796360150187043\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9041a702c52186ffb23b29e5a5ddeddefef6f576571a20f1f43027ff3225641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830983de883dd8f7cd7c3da3c23b2d33e795b3c75222381378c17d43f8fb435f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://830983de883dd8f7cd7c3da3c23b2d33e795b3c75222381378c17d43f8fb435f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:00Z is after 2025-08-24T17:21:41Z"
Mar 20 07:15:00 crc kubenswrapper[4749]: I0320 07:15:00.693449 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1384b577-c860-43e3-927f-3aa6d9eaadbc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3db71fc201b999f26a4841d7cff88cd6c415d1a2ad4920d354ed394ac8ad2982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ef347d178defb70362fc7330bec72e266f77e6bd46c7ce4cf0c7018d585171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ef347d178defb70362fc7330bec72e266f77e6bd46c7ce4cf0c7018d585171\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:00Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:00 crc kubenswrapper[4749]: I0320 07:15:00.717430 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:00Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:00 crc kubenswrapper[4749]: I0320 07:15:00.739303 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf9225edfc659321c44243f73dd56d4661a1d16c7c2a53b7ef69768d6b88f8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:00Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:00 crc kubenswrapper[4749]: I0320 07:15:00.756909 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r9vtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5fc763-08fb-4b02-a3cd-6f85310f0e14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38353c2a5737bf0e7e3552efcf7c55c31fded95481f21b06f9d364a944dbeebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x656g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r9vtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:00Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:00 crc kubenswrapper[4749]: I0320 07:15:00.769087 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rcq9v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f813da7-84d4-4550-ad66-f282814444a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://290c8178fc52bf0ce040051ac3f6e31f5f5245203c3a61c98c6a723710fbb94b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f01cb7c52842132ac657c21cb0cac4167a7b0c07ac20803552a8290a0d19e008\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T07:14:51Z\\\",\\\"message\\\":\\\"2026-03-20T07:14:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_02827f57-c4fa-48b0-ac23-f34108fdb778\\\\n2026-03-20T07:14:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_02827f57-c4fa-48b0-ac23-f34108fdb778 to /host/opt/cni/bin/\\\\n2026-03-20T07:14:06Z [verbose] multus-daemon started\\\\n2026-03-20T07:14:06Z [verbose] Readiness Indicator file check\\\\n2026-03-20T07:14:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkbh9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rcq9v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:00Z is after 2025-08-24T17:21:41Z"
Mar 20 07:15:00 crc kubenswrapper[4749]: I0320 07:15:00.784606 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://114c64c832173caefe8b9d0030fd0ac53be4c97636f0d1735ad2b5149e38ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78309ee345e93c6d9fb93f2f6cd3b3b80b2a7feec2b0fbca5962e00d978c66c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:00Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:00 crc kubenswrapper[4749]: I0320 07:15:00.796316 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68xpr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://080ba314933e5d85aef3e133f4372bc6e5881f2b3cf3ce1e769927e6798f328e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c96bd69f6d76b0604262b3105aafd077a3b603667218b7e6b81a5fcb0b49b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68xpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:00Z is after 2025-08-24T17:21:41Z" Mar 20 
07:15:00 crc kubenswrapper[4749]: I0320 07:15:00.814195 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d3a080d-fda8-4885-b55d-3f5df3f85600\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a65618f22b785dac8d848207a38f7fb6d287be2ab31a4eac9731a364dd487702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bf150976cb265b706cccb6e625a9a0a06d47f2dbd69d032957551966e691e43\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T07:13:11Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 07:12:46.270127 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0320 07:12:46.272348 1 observer_polling.go:159] Starting file observer\\\\nI0320 07:12:46.302896 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 07:12:46.308966 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 07:13:11.822339 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 07:13:11.822515 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:13:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ee1edb21f5116ef152f5808824f0529ac3bc52a3959df7a21c031da45b5284a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92f4faf3fc2d9dbe3235934b4065feba40ae56c54cf67e5792a9183f56014fd5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb647173cc4db7ee41f0b9e18c802b8d111c5f0819c619960b4b29f4f698e558\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:00Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:00 crc kubenswrapper[4749]: I0320 07:15:00.830403 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d36aabe4-f4b7-4552-848b-0c22f7ac4753\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a9d3d56425dd88c89608d446f6d44c5f90644cea243dd023e74c5630a0a99e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e71bf5e132166e8d3e2f33eb325502e54ff36380220a07917135b27ebe41c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\
\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b332a4612c6855c57c6c15a305a1f56099dab01f849027ea2eeda56718010cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://688e8fa067ea553fac09be724c46f16706c8b3463f09d6a4e2cfe3212027da17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T07:13:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 07:13:46.902726 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 07:13:46.902897 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 07:13:46.903679 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-172154110/tls.crt::/tmp/serving-cert-172154110/tls.key\\\\\\\"\\\\nI0320 07:13:47.353972 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 07:13:47.356217 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 07:13:47.356236 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 07:13:47.356252 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 07:13:47.356257 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 07:13:47.360047 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 07:13:47.360067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 07:13:47.360094 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360107 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360118 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 07:13:47.360127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 07:13:47.360134 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 07:13:47.360142 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 07:13:47.361128 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:13:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb6e64ecd020e07bd8f22e52fcf960c975a09da0f06a9f43daf5bfbff01de3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:00Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:00 crc kubenswrapper[4749]: I0320 07:15:00.844729 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:00Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:00 crc kubenswrapper[4749]: I0320 07:15:00.862024 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g4qlg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19bf4391-88b7-43a0-9b6a-435261a44ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19e11d475744ecbce4f3285124657c66590dc339fe1af7d863b19d129ca09bbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://263bd63d005d0bec35609f642694605cd9fea786c2842c857d7f5efc7a6b6af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263bd63d005d0bec35609f642694605cd9fea786c2842c857d7f5efc7a6b6af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e9ed8774358234497eacb680940c422ab5086a1baa09efae600fd7e0080c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e9ed8774358234497eacb680940c422ab5086a1baa09efae600fd7e0080c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://812f9fb3af64c0bcffc23b7bef225d20328fe3348d910b174c99f2330ef75bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://812f9fb3af64c0bcffc23b7bef225d20328fe3348d910b174c99f2330ef75bcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g4qlg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:00Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:00 crc kubenswrapper[4749]: I0320 07:15:00.875374 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnwpn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdf0a692-3cf9-4abe-8b52-c81a040c0e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e24c08ed2a41f1d8a54c1c9edf5511f5ef6016bbdfa19cf6c40e8a639e1e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgjwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnwpn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:00Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:00 crc kubenswrapper[4749]: I0320 07:15:00.887727 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k56zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d19b89e-d048-4656-b5ce-c637190ab678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k56zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:00Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:00 crc kubenswrapper[4749]: I0320 07:15:00.901413 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c28e09a3eab6484907f72f0c4e3809f5a04d1b344fba717e6639f97c544acf83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:00Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:00 crc kubenswrapper[4749]: I0320 07:15:00.918125 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:00Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:00 crc kubenswrapper[4749]: I0320 07:15:00.929171 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12151228-1cb9-4086-9a62-f4a9583f5f69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://727db5182d25f135cceb40ce56d93c74fd6ff79a08e042fded129a1b8c96eb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e97608b8dbd15f9f6a4df363aa16c0f7e4a3d501a4182627876064290b63e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxqfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:00Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:01 crc kubenswrapper[4749]: I0320 07:15:01.630899 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tdgcw_2153d97b-a108-49f8-b6c8-8223ea65b878/ovnkube-controller/3.log" Mar 20 07:15:01 crc kubenswrapper[4749]: I0320 07:15:01.632475 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tdgcw_2153d97b-a108-49f8-b6c8-8223ea65b878/ovnkube-controller/2.log" Mar 20 07:15:01 crc kubenswrapper[4749]: I0320 07:15:01.638008 4749 generic.go:334] "Generic (PLEG): container finished" podID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerID="b7b48ca93159bd68739b8c9e9752d534320245059bc6d665239b5f34bf14bcc9" exitCode=1 Mar 20 07:15:01 crc kubenswrapper[4749]: I0320 07:15:01.638098 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" event={"ID":"2153d97b-a108-49f8-b6c8-8223ea65b878","Type":"ContainerDied","Data":"b7b48ca93159bd68739b8c9e9752d534320245059bc6d665239b5f34bf14bcc9"} Mar 20 07:15:01 crc kubenswrapper[4749]: I0320 07:15:01.638190 4749 scope.go:117] "RemoveContainer" containerID="4ea02e47ca928e345d9158ed3cad49551f45914a2501a428a0d1ad63e4bf933e" Mar 20 07:15:01 crc kubenswrapper[4749]: I0320 07:15:01.639451 4749 scope.go:117] "RemoveContainer" containerID="b7b48ca93159bd68739b8c9e9752d534320245059bc6d665239b5f34bf14bcc9" Mar 20 07:15:01 crc kubenswrapper[4749]: E0320 07:15:01.639782 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-tdgcw_openshift-ovn-kubernetes(2153d97b-a108-49f8-b6c8-8223ea65b878)\"" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" Mar 20 07:15:01 crc kubenswrapper[4749]: I0320 07:15:01.663413 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://114c64c832173caefe8b9d0030fd0ac53be4c97636f0d1735ad2b5149e38ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78309ee345e93c6d9fb93f2f6cd3b3b80b2a7feec2b0fbca5962e00d978c66c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:01Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:01 crc kubenswrapper[4749]: I0320 07:15:01.677718 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68xpr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://080ba314933e5d85aef3e133f4372bc6e5881f2b3cf3ce1e769927e6798f328e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c96bd69f6d76b0604262b3105aafd077a3b603667218b7e6b81a5fcb0b49b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68xpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:01Z is after 2025-08-24T17:21:41Z" Mar 20 
07:15:01 crc kubenswrapper[4749]: I0320 07:15:01.690839 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:01Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:01 crc kubenswrapper[4749]: I0320 07:15:01.709115 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g4qlg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19bf4391-88b7-43a0-9b6a-435261a44ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19e11d475744ecbce4f3285124657c66590dc339fe1af7d863b19d129ca09bbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://263bd63d005d0bec35609f642694605cd9fea786c2842c857d7f5efc7a6b6af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263bd63d005d0bec35609f642694605cd9fea786c2842c857d7f5efc7a6b6af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e9ed8774358234497eacb680940c422ab5086a1baa09efae600fd7e0080c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e9ed8774358234497eacb680940c422ab5086a1baa09efae600fd7e0080c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://812f9fb3af64c0bcffc23b7bef225d20328fe3348d910b174c99f2330ef75bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://812f9fb3af64c0bcffc23b7bef225d20328fe3348d910b174c99f2330ef75bcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g4qlg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:01Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:01 crc kubenswrapper[4749]: I0320 07:15:01.722630 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnwpn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdf0a692-3cf9-4abe-8b52-c81a040c0e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e24c08ed2a41f1d8a54c1c9edf5511f5ef6016bbdfa19cf6c40e8a639e1e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgjwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnwpn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:01Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:01 crc kubenswrapper[4749]: I0320 07:15:01.736571 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k56zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d19b89e-d048-4656-b5ce-c637190ab678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k56zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:01Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:01 crc kubenswrapper[4749]: I0320 07:15:01.754078 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d3a080d-fda8-4885-b55d-3f5df3f85600\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a65618f22b785dac8d848207a38f7fb6d287be2ab31a4eac9731a364dd487702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bf150976cb265b706cccb6e625a9a0a06d47f2dbd69d032957551966e691e43\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T07:13:11Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 07:12:46.270127 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0320 07:12:46.272348 1 observer_polling.go:159] Starting file observer\\\\nI0320 07:12:46.302896 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 07:12:46.308966 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 07:13:11.822339 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 07:13:11.822515 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:13:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ee1edb21f5116ef152f5808824f0529ac3bc52a3959df7a21c031da45b5284a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92f4faf3fc2d9dbe3235934b4065feba40ae56c54cf67e5792a9183f56014fd5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb647173cc4db7ee41f0b9e18c802b8d111c5f0819c619960b4b29f4f698e558\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:01Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:01 crc kubenswrapper[4749]: I0320 07:15:01.776227 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d36aabe4-f4b7-4552-848b-0c22f7ac4753\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a9d3d56425dd88c89608d446f6d44c5f90644cea243dd023e74c5630a0a99e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e71bf5e132166e8d3e2f33eb325502e54ff36380220a07917135b27ebe41c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\
\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b332a4612c6855c57c6c15a305a1f56099dab01f849027ea2eeda56718010cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://688e8fa067ea553fac09be724c46f16706c8b3463f09d6a4e2cfe3212027da17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T07:13:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 07:13:46.902726 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 07:13:46.902897 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 07:13:46.903679 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-172154110/tls.crt::/tmp/serving-cert-172154110/tls.key\\\\\\\"\\\\nI0320 07:13:47.353972 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 07:13:47.356217 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 07:13:47.356236 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 07:13:47.356252 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 07:13:47.356257 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 07:13:47.360047 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 07:13:47.360067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 07:13:47.360094 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360107 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360118 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 07:13:47.360127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 07:13:47.360134 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 07:13:47.360142 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 07:13:47.361128 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:13:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb6e64ecd020e07bd8f22e52fcf960c975a09da0f06a9f43daf5bfbff01de3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:01Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:01 crc kubenswrapper[4749]: I0320 07:15:01.794668 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12151228-1cb9-4086-9a62-f4a9583f5f69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://727db5182d25f135cceb40ce56d93c74fd6ff79a08e042fded129a1b8c96eb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e97608b8dbd15f9f6a4df363aa16c0f7e4a3d501a4182627876064290b63e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxqfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:01Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:01 crc kubenswrapper[4749]: I0320 07:15:01.814685 4749 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c28e09a3eab6484907f72f0c4e3809f5a04d1b344fba717e6639f97c544acf83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:01Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:01 crc kubenswrapper[4749]: I0320 07:15:01.835664 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:01Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:01 crc kubenswrapper[4749]: I0320 07:15:01.852105 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1384b577-c860-43e3-927f-3aa6d9eaadbc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3db71fc201b999f26a4841d7cff88cd6c415d1a2ad4920d354ed394ac8ad2982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ef347d178defb70362fc7330bec72e266f77e6bd46c7ce4cf0c7018d585171\\\",\\\"image\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ef347d178defb70362fc7330bec72e266f77e6bd46c7ce4cf0c7018d585171\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:01Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:01 crc kubenswrapper[4749]: I0320 07:15:01.871717 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:01Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:01 crc kubenswrapper[4749]: I0320 07:15:01.889950 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf9225edfc659321c44243f73dd56d4661a1d16c7c2a53b7ef69768d6b88f8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:01Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:01 crc kubenswrapper[4749]: I0320 07:15:01.905986 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r9vtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5fc763-08fb-4b02-a3cd-6f85310f0e14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38353c2a5737bf0e7e3552efcf7c55c31fded95481f21b06f9d364a944dbeebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x656g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r9vtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:01Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:01 crc kubenswrapper[4749]: I0320 07:15:01.924109 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rcq9v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f813da7-84d4-4550-ad66-f282814444a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://290c8178fc52bf0ce040051ac3f6e31f5f5245203c3a61c98c6a723710fbb94b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f01cb7c52842132ac657c21cb0cac4167a7b0c07ac20803552a8290a0d19e008\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T07:14:51Z\\\",\\\"message\\\":\\\"2026-03-20T07:14:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_02827f57-c4fa-48b0-ac23-f34108fdb778\\\\n2026-03-20T07:14:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_02827f57-c4fa-48b0-ac23-f34108fdb778 to /host/opt/cni/bin/\\\\n2026-03-20T07:14:06Z [verbose] multus-daemon started\\\\n2026-03-20T07:14:06Z [verbose] Readiness Indicator file check\\\\n2026-03-20T07:14:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkbh9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rcq9v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:01Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:01 crc kubenswrapper[4749]: I0320 07:15:01.955460 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2153d97b-a108-49f8-b6c8-8223ea65b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4f35833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7b48ca93159bd68739b8c9e9752d534320245059bc6d665239b5f34bf14bcc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ea02e47ca928e345d9158ed3cad49551f45914a2501a428a0d1ad63e4bf933e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T07:14:32Z\\\",\\\"message\\\":\\\"od event handler 3\\\\nI0320 07:14:32.081966 6927 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 07:14:32.082007 6927 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 07:14:32.081939 6927 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 07:14:32.082053 6927 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 07:14:32.082092 6927 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 07:14:32.082103 6927 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 07:14:32.082153 6927 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 07:14:32.082169 6927 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 07:14:32.082188 6927 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 07:14:32.082205 6927 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 07:14:32.082223 6927 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 07:14:32.082225 6927 factory.go:656] Stopping watch factory\\\\nI0320 07:14:32.082255 6927 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 07:14:32.082358 6927 ovnkube.go:599] Stopped ovnkube\\\\nI0320 07:14:32.082420 6927 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 07:14:32.082513 6927 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7b48ca93159bd68739b8c9e9752d534320245059bc6d665239b5f34bf14bcc9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T07:15:01Z\\\",\\\"message\\\":\\\"} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 07:15:01.113875 7279 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 07:15:01.114051 7279 model_client.go:398] Mutate operations generated as: [{Op:mutate 
Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 07:15:01.111807 7279 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 07:15:01.114260 7279 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},
{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tdgcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:01Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:01 crc kubenswrapper[4749]: I0320 07:15:01.988222 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ade5670f-28bc-4c68-b28c-cec1ee830afe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f986500b6f9e4ab2cf3366a7e05e9274f9192bdc576e52c82f8dafc9f1ce37c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0549d8b1f6c2132ed8356ac6c67078f9431cf7a9b057922e0ba5e2eb9f7f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f97471d68760ce0f43e5c1c0bafa7c6b429812dd58e2b2fa2eabd378a0789d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://478f80cc5a3895ca8ae8adbaa46990b39941e28
24b2bf1c93ca34bb7d15cbdd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a361154efd67448ce4f9008639d02864d9d3aa766b0937f3729b13a5d0b8948a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:01Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:02 crc kubenswrapper[4749]: I0320 07:15:02.004774 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a9a22f5-482c-4da4-b2c6-cb1b9dad05e1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93eaff3eb0b1240b3d19fdd70f9a27c0543e794b7bb61e3e1886807a8a712758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://189ab91bc96e9893f362ea6fae4ae81880b230b84a3987760796360150187043\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9041a702c52186ffb23b29e5a5ddeddefef6f576571a20f1f43027ff3225641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830983de883dd8f7cd7c3da3c23b2d33e795b3c75222381378c17d43f8fb435f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://830983de883dd8f7cd7c3da3c23b2d33e795b3c75222381378c17d43f8fb435f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:02Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:02 crc kubenswrapper[4749]: I0320 07:15:02.177080 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:15:02 crc kubenswrapper[4749]: I0320 07:15:02.177187 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:15:02 crc kubenswrapper[4749]: I0320 07:15:02.177106 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:15:02 crc kubenswrapper[4749]: I0320 07:15:02.177311 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:15:02 crc kubenswrapper[4749]: E0320 07:15:02.177331 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 07:15:02 crc kubenswrapper[4749]: E0320 07:15:02.177446 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 07:15:02 crc kubenswrapper[4749]: E0320 07:15:02.177582 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678" Mar 20 07:15:02 crc kubenswrapper[4749]: E0320 07:15:02.177723 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 07:15:02 crc kubenswrapper[4749]: I0320 07:15:02.645594 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tdgcw_2153d97b-a108-49f8-b6c8-8223ea65b878/ovnkube-controller/3.log" Mar 20 07:15:02 crc kubenswrapper[4749]: I0320 07:15:02.650556 4749 scope.go:117] "RemoveContainer" containerID="b7b48ca93159bd68739b8c9e9752d534320245059bc6d665239b5f34bf14bcc9" Mar 20 07:15:02 crc kubenswrapper[4749]: E0320 07:15:02.650918 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-tdgcw_openshift-ovn-kubernetes(2153d97b-a108-49f8-b6c8-8223ea65b878)\"" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" Mar 20 07:15:02 crc kubenswrapper[4749]: I0320 07:15:02.675113 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c28e09a3eab6484907f72f0c4e3809f5a04d1b344fba717e6639f97c544acf83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:02Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:02 crc kubenswrapper[4749]: I0320 07:15:02.696055 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:02Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:02 crc kubenswrapper[4749]: I0320 07:15:02.714377 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12151228-1cb9-4086-9a62-f4a9583f5f69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://727db5182d25f135cceb40ce56d93c74fd6ff79a08e042fded129a1b8c96eb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e97608b8dbd15f9f6a4df363aa16c0f7e4a3d501a4182627876064290b63e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxqfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:02Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:02 crc kubenswrapper[4749]: I0320 07:15:02.732425 4749 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf9225edfc659321c44243f73dd56d4661a1d16c7c2a53b7ef69768d6b88f8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:02Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:02 crc kubenswrapper[4749]: I0320 07:15:02.748236 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r9vtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5fc763-08fb-4b02-a3cd-6f85310f0e14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38353c2a5737bf0e7e3552efcf7c55c31fded95481f21b06f9d364a944dbeebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x656g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r9vtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:02Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:02 crc kubenswrapper[4749]: I0320 07:15:02.768023 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rcq9v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f813da7-84d4-4550-ad66-f282814444a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://290c8178fc52bf0ce040051ac3f6e31f5f5245203c3a61c98c6a723710fbb94b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f01cb7c52842132ac657c21cb0cac4167a7b0c07ac20803552a8290a0d19e008\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T07:14:51Z\\\",\\\"message\\\":\\\"2026-03-20T07:14:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_02827f57-c4fa-48b0-ac23-f34108fdb778\\\\n2026-03-20T07:14:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_02827f57-c4fa-48b0-ac23-f34108fdb778 to /host/opt/cni/bin/\\\\n2026-03-20T07:14:06Z [verbose] multus-daemon started\\\\n2026-03-20T07:14:06Z [verbose] Readiness Indicator file check\\\\n2026-03-20T07:14:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkbh9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rcq9v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:02Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:02 crc kubenswrapper[4749]: I0320 07:15:02.798089 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2153d97b-a108-49f8-b6c8-8223ea65b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4f35833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7b48ca93159bd68739b8c9e9752d534320245059bc6d665239b5f34bf14bcc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7b48ca93159bd68739b8c9e9752d534320245059bc6d665239b5f34bf14bcc9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T07:15:01Z\\\",\\\"message\\\":\\\"} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 07:15:01.113875 7279 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 07:15:01.114051 7279 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 07:15:01.111807 7279 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 07:15:01.114260 7279 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:15:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tdgcw_openshift-ovn-kubernetes(2153d97b-a108-49f8-b6c8-8223ea65b878)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tdgcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:02Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:02 crc kubenswrapper[4749]: I0320 07:15:02.828783 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ade5670f-28bc-4c68-b28c-cec1ee830afe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f986500b6f9e4ab2cf3366a7e05e9274f9192bdc576e52c82f8dafc9f1ce37c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0549d8b1f6c2132ed8356a
c6c67078f9431cf7a9b057922e0ba5e2eb9f7f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f97471d68760ce0f43e5c1c0bafa7c6b429812dd58e2b2fa2eabd378a0789d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://478f80cc5a3895ca8ae8adbaa46990b39941e2824b2bf1c93ca34bb7d15cbdd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a361154efd67448ce4f9008639d02864d9d3aa766b0937f3729b13a5d0b8948a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:02Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:02 crc kubenswrapper[4749]: I0320 07:15:02.847113 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a9a22f5-482c-4da4-b2c6-cb1b9dad05e1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93eaff3eb0b1240b3d19fdd70f9a27c0543e794b7bb61e3e1886807a8a712758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://189ab91bc96e9893f362ea6fae4ae81880b230b84a3987760796360150187043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9041a702c52186ffb23b29e5a5ddeddefef6f576571a20f1f43027ff3225641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830983de883dd8f7cd7c3da3c23b2d33e795b3c75222381378c17d43f8fb435f\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://830983de883dd8f7cd7c3da3c23b2d33e795b3c75222381378c17d43f8fb435f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:02Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:02 crc kubenswrapper[4749]: I0320 07:15:02.858237 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1384b577-c860-43e3-927f-3aa6d9eaadbc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3db71fc201b999f26a4841d7cff88cd6c415d1a2ad4920d354ed394ac8ad2982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ef347d178defb70362fc7330bec72e266f77e6bd46c7ce4cf0c7018d585171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ef347d178defb70362fc7330bec72e266f77e6bd46c7ce4cf0c7018d585171\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:02Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:02 crc kubenswrapper[4749]: I0320 07:15:02.876984 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:02Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:02 crc kubenswrapper[4749]: I0320 07:15:02.894999 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://114c64c832173caefe8b9d0030fd0ac53be4c97636f0d1735ad2b5149e38ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78309ee345e93c6d9fb93f2f6cd3b3b80b2a7feec2b0fbca5962e00d978c66c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:02Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:02 crc kubenswrapper[4749]: I0320 07:15:02.909391 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68xpr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://080ba314933e5d85aef3e133f4372bc6e5881f2b3cf3ce1e769927e6798f328e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c96bd69f6d76b0604262b3105aafd077a3b603667218b7e6b81a5fcb0b49b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\
\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68xpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:02Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:02 crc kubenswrapper[4749]: I0320 07:15:02.924404 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnwpn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdf0a692-3cf9-4abe-8b52-c81a040c0e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e24c08ed2a41f1d8a54c1c9edf5511f5ef6016bbdfa19cf6c40e8a639e1e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgjwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnwpn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:02Z is after 2025-08-24T17:21:41Z" Mar 20 
07:15:02 crc kubenswrapper[4749]: I0320 07:15:02.938989 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k56zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d19b89e-d048-4656-b5ce-c637190ab678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k56zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:02Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:02 crc kubenswrapper[4749]: I0320 07:15:02.957465 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d3a080d-fda8-4885-b55d-3f5df3f85600\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a65618f22b785dac8d848207a38f7fb6d287be2ab31a4eac9731a364dd487702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bf150976cb265b706cccb6e625a9a0a06d47f2dbd69d032957551966e691e43\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T07:13:11Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 07:12:46.270127 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0320 07:12:46.272348 1 observer_polling.go:159] Starting file observer\\\\nI0320 07:12:46.302896 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 07:12:46.308966 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 07:13:11.822339 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 07:13:11.822515 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:13:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ee1edb21f5116ef152f5808824f0529ac3bc52a3959df7a21c031da45b5284a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92f4faf3fc2d9dbe3235934b4065feba40ae56c54cf67e5792a9183f56014fd5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb647173cc4db7ee41f0b9e18c802b8d111c5f0819c619960b4b29f4f698e558\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:02Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:02 crc kubenswrapper[4749]: I0320 07:15:02.976911 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d36aabe4-f4b7-4552-848b-0c22f7ac4753\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a9d3d56425dd88c89608d446f6d44c5f90644cea243dd023e74c5630a0a99e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e71bf5e132166e8d3e2f33eb325502e54ff36380220a07917135b27ebe41c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\
\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b332a4612c6855c57c6c15a305a1f56099dab01f849027ea2eeda56718010cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://688e8fa067ea553fac09be724c46f16706c8b3463f09d6a4e2cfe3212027da17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T07:13:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 07:13:46.902726 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 07:13:46.902897 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 07:13:46.903679 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-172154110/tls.crt::/tmp/serving-cert-172154110/tls.key\\\\\\\"\\\\nI0320 07:13:47.353972 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 07:13:47.356217 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 07:13:47.356236 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 07:13:47.356252 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 07:13:47.356257 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 07:13:47.360047 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 07:13:47.360067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 07:13:47.360094 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360107 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360118 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 07:13:47.360127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 07:13:47.360134 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 07:13:47.360142 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 07:13:47.361128 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:13:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb6e64ecd020e07bd8f22e52fcf960c975a09da0f06a9f43daf5bfbff01de3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:02Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:02 crc kubenswrapper[4749]: I0320 07:15:02.990588 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:02Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:03 crc kubenswrapper[4749]: I0320 07:15:03.015093 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g4qlg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19bf4391-88b7-43a0-9b6a-435261a44ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19e11d475744ecbce4f3285124657c66590dc339fe1af7d863b19d129ca09bbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://263bd63d005d0bec35609f642694605cd9fea786c2842c857d7f5efc7a6b6af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263bd63d005d0bec35609f642694605cd9fea786c2842c857d7f5efc7a6b6af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e9ed8774358234497eacb680940c422ab5086a1baa09efae600fd7e0080c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e9ed8774358234497eacb680940c422ab5086a1baa09efae600fd7e0080c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://812f9fb3af64c0bcffc23b7bef225d20328fe3348d910b174c99f2330ef75bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://812f9fb3af64c0bcffc23b7bef225d20328fe3348d910b174c99f2330ef75bcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g4qlg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:03Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:04 crc kubenswrapper[4749]: I0320 07:15:04.176379 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:15:04 crc kubenswrapper[4749]: I0320 07:15:04.176443 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:15:04 crc kubenswrapper[4749]: E0320 07:15:04.176558 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 07:15:04 crc kubenswrapper[4749]: I0320 07:15:04.176575 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:15:04 crc kubenswrapper[4749]: I0320 07:15:04.176657 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:15:04 crc kubenswrapper[4749]: E0320 07:15:04.176708 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 07:15:04 crc kubenswrapper[4749]: E0320 07:15:04.176871 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678" Mar 20 07:15:04 crc kubenswrapper[4749]: E0320 07:15:04.176985 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 07:15:04 crc kubenswrapper[4749]: I0320 07:15:04.191970 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fnwpn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdf0a692-3cf9-4abe-8b52-c81a040c0e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e24c08ed2a41f1d8a54c1c9edf5511f5ef6016bbdfa19cf6c40e8a639e1e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgjwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fnwpn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:04Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:04 crc kubenswrapper[4749]: I0320 07:15:04.206386 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k56zh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d19b89e-d048-4656-b5ce-c637190ab678\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcf5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k56zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:04Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:04 crc kubenswrapper[4749]: I0320 07:15:04.223760 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d3a080d-fda8-4885-b55d-3f5df3f85600\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a65618f22b785dac8d848207a38f7fb6d287be2ab31a4eac9731a364dd487702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bf150976cb265b706cccb6e625a9a0a06d47f2dbd69d032957551966e691e43\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T07:13:11Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 07:12:46.270127 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0320 07:12:46.272348 1 observer_polling.go:159] Starting file observer\\\\nI0320 07:12:46.302896 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 07:12:46.308966 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 07:13:11.822339 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 07:13:11.822515 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:13:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ee1edb21f5116ef152f5808824f0529ac3bc52a3959df7a21c031da45b5284a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92f4faf3fc2d9dbe3235934b4065feba40ae56c54cf67e5792a9183f56014fd5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb647173cc4db7ee41f0b9e18c802b8d111c5f0819c619960b4b29f4f698e558\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:04Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:04 crc kubenswrapper[4749]: I0320 07:15:04.244451 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d36aabe4-f4b7-4552-848b-0c22f7ac4753\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a9d3d56425dd88c89608d446f6d44c5f90644cea243dd023e74c5630a0a99e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e71bf5e132166e8d3e2f33eb325502e54ff36380220a07917135b27ebe41c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\
\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b332a4612c6855c57c6c15a305a1f56099dab01f849027ea2eeda56718010cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://688e8fa067ea553fac09be724c46f16706c8b3463f09d6a4e2cfe3212027da17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T07:13:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 07:13:46.902726 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 07:13:46.902897 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 07:13:46.903679 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-172154110/tls.crt::/tmp/serving-cert-172154110/tls.key\\\\\\\"\\\\nI0320 07:13:47.353972 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 07:13:47.356217 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 07:13:47.356236 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 07:13:47.356252 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 07:13:47.356257 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 07:13:47.360047 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 07:13:47.360067 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 07:13:47.360094 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360107 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 07:13:47.360118 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 07:13:47.360127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 07:13:47.360134 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 07:13:47.360142 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 07:13:47.361128 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:13:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb6e64ecd020e07bd8f22e52fcf960c975a09da0f06a9f43daf5bfbff01de3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:04Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:04 crc kubenswrapper[4749]: I0320 07:15:04.258037 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:04Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:04 crc kubenswrapper[4749]: I0320 07:15:04.279862 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g4qlg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19bf4391-88b7-43a0-9b6a-435261a44ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19e11d475744ecbce4f3285124657c66590dc339fe1af7d863b19d129ca09bbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b1bc100821f5e088b25e1a3134eb3569f8d7f9d646e9a205f4e285de5347eaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://766f6c2ad6eb77414914f68c969c86a78213e8484f063a3ba8c264d515d29c17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc898e4e80ba0788b3b9d1a6222eafa6ff7c0f832e729c7d5ec9cf5a9e12ff18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://263bd63d005d0bec35609f642694605cd9fea786c2842c857d7f5efc7a6b6af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://263bd63d005d0bec35609f642694605cd9fea786c2842c857d7f5efc7a6b6af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21e9ed8774358234497eacb680940c422ab5086a1baa09efae600fd7e0080c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e9ed8774358234497eacb680940c422ab5086a1baa09efae600fd7e0080c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://812f9fb3af64c0bcffc23b7bef225d20328fe3348d910b174c99f2330ef75bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://812f9fb3af64c0bcffc23b7bef225d20328fe3348d910b174c99f2330ef75bcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppqc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g4qlg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:04Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:04 crc kubenswrapper[4749]: I0320 07:15:04.294904 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c28e09a3eab6484907f72f0c4e3809f5a04d1b344fba717e6639f97c544acf83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:04Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:04 crc kubenswrapper[4749]: E0320 07:15:04.308654 4749 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 07:15:04 crc kubenswrapper[4749]: I0320 07:15:04.312228 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:04Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:04 crc kubenswrapper[4749]: I0320 07:15:04.326250 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12151228-1cb9-4086-9a62-f4a9583f5f69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://727db5182d25f135cceb40ce56d93c74fd6ff79a08e042fded129a1b8c96eb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e97608b8dbd15f9f6a4df363aa16c0f7e4a3d501a4182627876064290b63e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w749n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxqfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:04Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:04 crc kubenswrapper[4749]: I0320 07:15:04.338141 4749 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf9225edfc659321c44243f73dd56d4661a1d16c7c2a53b7ef69768d6b88f8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:04Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:04 crc kubenswrapper[4749]: I0320 07:15:04.352165 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r9vtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5fc763-08fb-4b02-a3cd-6f85310f0e14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38353c2a5737bf0e7e3552efcf7c55c31fded95481f21b06f9d364a944dbeebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x656g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r9vtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:04Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:04 crc kubenswrapper[4749]: I0320 07:15:04.371054 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rcq9v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f813da7-84d4-4550-ad66-f282814444a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://290c8178fc52bf0ce040051ac3f6e31f5f5245203c3a61c98c6a723710fbb94b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f01cb7c52842132ac657c21cb0cac4167a7b0c07ac20803552a8290a0d19e008\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T07:14:51Z\\\",\\\"message\\\":\\\"2026-03-20T07:14:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_02827f57-c4fa-48b0-ac23-f34108fdb778\\\\n2026-03-20T07:14:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_02827f57-c4fa-48b0-ac23-f34108fdb778 to /host/opt/cni/bin/\\\\n2026-03-20T07:14:06Z [verbose] multus-daemon started\\\\n2026-03-20T07:14:06Z [verbose] Readiness Indicator file check\\\\n2026-03-20T07:14:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkbh9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rcq9v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:04Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:04 crc kubenswrapper[4749]: I0320 07:15:04.398804 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2153d97b-a108-49f8-b6c8-8223ea65b878\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4f35833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7b48ca93159bd68739b8c9e9752d534320245059bc6d665239b5f34bf14bcc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7b48ca93159bd68739b8c9e9752d534320245059bc6d665239b5f34bf14bcc9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T07:15:01Z\\\",\\\"message\\\":\\\"} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 07:15:01.113875 7279 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 07:15:01.114051 7279 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 07:15:01.111807 7279 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 07:15:01.114260 7279 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T07:15:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tdgcw_openshift-ovn-kubernetes(2153d97b-a108-49f8-b6c8-8223ea65b878)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77vkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tdgcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:04Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:04 crc kubenswrapper[4749]: I0320 07:15:04.431178 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ade5670f-28bc-4c68-b28c-cec1ee830afe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f986500b6f9e4ab2cf3366a7e05e9274f9192bdc576e52c82f8dafc9f1ce37c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0549d8b1f6c2132ed8356a
c6c67078f9431cf7a9b057922e0ba5e2eb9f7f3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f97471d68760ce0f43e5c1c0bafa7c6b429812dd58e2b2fa2eabd378a0789d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://478f80cc5a3895ca8ae8adbaa46990b39941e2824b2bf1c93ca34bb7d15cbdd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a361154efd67448ce4f9008639d02864d9d3aa766b0937f3729b13a5d0b8948a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b441d98bffd8ea5991e6d36777747f70c6cefeb4b2ebeb2c6ce423735740b347\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf311589937a4a32d4e7b80ae0488f878e8e5393ae94a068eae7fbbe16280915\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6fe60e02f51f099aa628e5090243f8c4543bdbe2de67d06e5b093929fee9ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:04Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:04 crc kubenswrapper[4749]: I0320 07:15:04.449697 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a9a22f5-482c-4da4-b2c6-cb1b9dad05e1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93eaff3eb0b1240b3d19fdd70f9a27c0543e794b7bb61e3e1886807a8a712758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://189ab91bc96e9893f362ea6fae4ae81880b230b84a3987760796360150187043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9041a702c52186ffb23b29e5a5ddeddefef6f576571a20f1f43027ff3225641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830983de883dd8f7cd7c3da3c23b2d33e795b3c75222381378c17d43f8fb435f\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://830983de883dd8f7cd7c3da3c23b2d33e795b3c75222381378c17d43f8fb435f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:04Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:04 crc kubenswrapper[4749]: I0320 07:15:04.465120 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1384b577-c860-43e3-927f-3aa6d9eaadbc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:12:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3db71fc201b999f26a4841d7cff88cd6c415d1a2ad4920d354ed394ac8ad2982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ef347d178defb70362fc7330bec72e266f77e6bd46c7ce4cf0c7018d585171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49ef347d178defb70362fc7330bec72e266f77e6bd46c7ce4cf0c7018d585171\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T07:12:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T07:12:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:12:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:04Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:04 crc kubenswrapper[4749]: I0320 07:15:04.481735 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:04Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:04 crc kubenswrapper[4749]: I0320 07:15:04.500406 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://114c64c832173caefe8b9d0030fd0ac53be4c97636f0d1735ad2b5149e38ab28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78309ee345e93c6d9fb93f2f6cd3b3b80b2a7feec2b0fbca5962e00d978c66c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:04Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:04 crc kubenswrapper[4749]: I0320 07:15:04.517958 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68xpr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7e40eb5-dd2b-4e2e-8ae8-afcb760595b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T07:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://080ba314933e5d85aef3e133f4372bc6e5881f2b3cf3ce1e769927e6798f328e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c96bd69f6d76b0604262b3105aafd077a3b603667218b7e6b81a5fcb0b49b3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T07:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\
\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T07:14:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-68xpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:04Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:05 crc kubenswrapper[4749]: I0320 07:15:05.957524 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:15:05 crc kubenswrapper[4749]: I0320 07:15:05.958191 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:15:05 crc kubenswrapper[4749]: I0320 07:15:05.958213 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:15:05 crc kubenswrapper[4749]: I0320 07:15:05.958249 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:15:05 crc kubenswrapper[4749]: I0320 07:15:05.958273 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:15:05Z","lastTransitionTime":"2026-03-20T07:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 07:15:05 crc kubenswrapper[4749]: E0320 07:15:05.982935 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:15:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:15:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:15:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:15:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:15:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:15:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:15:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:15:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cbc31b-af36-4be8-8e88-99f024097007\\\",\\\"systemUUID\\\":\\\"42f570dd-c9b2-4d24-870f-033a21aa11c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:05Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:05 crc kubenswrapper[4749]: I0320 07:15:05.988707 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:15:05 crc kubenswrapper[4749]: I0320 07:15:05.988775 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 07:15:05 crc kubenswrapper[4749]: I0320 07:15:05.988789 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:15:05 crc kubenswrapper[4749]: I0320 07:15:05.988808 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:15:05 crc kubenswrapper[4749]: I0320 07:15:05.988825 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:15:05Z","lastTransitionTime":"2026-03-20T07:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:15:06 crc kubenswrapper[4749]: E0320 07:15:06.004254 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:15:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:15:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:15:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:15:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:15:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:15:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:15:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:15:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cbc31b-af36-4be8-8e88-99f024097007\\\",\\\"systemUUID\\\":\\\"42f570dd-c9b2-4d24-870f-033a21aa11c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:06Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:06 crc kubenswrapper[4749]: I0320 07:15:06.008937 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:15:06 crc kubenswrapper[4749]: I0320 07:15:06.009001 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 07:15:06 crc kubenswrapper[4749]: I0320 07:15:06.009019 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:15:06 crc kubenswrapper[4749]: I0320 07:15:06.009048 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:15:06 crc kubenswrapper[4749]: I0320 07:15:06.009065 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:15:06Z","lastTransitionTime":"2026-03-20T07:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:15:06 crc kubenswrapper[4749]: E0320 07:15:06.029033 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:15:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:15:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:15:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:15:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:15:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:15:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:15:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:15:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cbc31b-af36-4be8-8e88-99f024097007\\\",\\\"systemUUID\\\":\\\"42f570dd-c9b2-4d24-870f-033a21aa11c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:06Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:06 crc kubenswrapper[4749]: I0320 07:15:06.034613 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:15:06 crc kubenswrapper[4749]: I0320 07:15:06.034664 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 07:15:06 crc kubenswrapper[4749]: I0320 07:15:06.034681 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:15:06 crc kubenswrapper[4749]: I0320 07:15:06.034705 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:15:06 crc kubenswrapper[4749]: I0320 07:15:06.034723 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:15:06Z","lastTransitionTime":"2026-03-20T07:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:15:06 crc kubenswrapper[4749]: E0320 07:15:06.049508 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:15:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:15:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:15:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:15:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:15:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:15:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:15:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:15:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cbc31b-af36-4be8-8e88-99f024097007\\\",\\\"systemUUID\\\":\\\"42f570dd-c9b2-4d24-870f-033a21aa11c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:06Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:06 crc kubenswrapper[4749]: I0320 07:15:06.053613 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:15:06 crc kubenswrapper[4749]: I0320 07:15:06.053692 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 07:15:06 crc kubenswrapper[4749]: I0320 07:15:06.053713 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:15:06 crc kubenswrapper[4749]: I0320 07:15:06.053742 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:15:06 crc kubenswrapper[4749]: I0320 07:15:06.053765 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:15:06Z","lastTransitionTime":"2026-03-20T07:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:15:06 crc kubenswrapper[4749]: E0320 07:15:06.072815 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:15:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:15:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:15:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:15:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:15:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:15:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T07:15:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T07:15:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e6cbc31b-af36-4be8-8e88-99f024097007\\\",\\\"systemUUID\\\":\\\"42f570dd-c9b2-4d24-870f-033a21aa11c5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T07:15:06Z is after 2025-08-24T17:21:41Z" Mar 20 07:15:06 crc kubenswrapper[4749]: E0320 07:15:06.072933 4749 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 07:15:06 crc kubenswrapper[4749]: I0320 07:15:06.177244 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:15:06 crc kubenswrapper[4749]: I0320 07:15:06.177354 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:15:06 crc kubenswrapper[4749]: I0320 07:15:06.177362 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:15:06 crc kubenswrapper[4749]: E0320 07:15:06.177437 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 07:15:06 crc kubenswrapper[4749]: E0320 07:15:06.177520 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 07:15:06 crc kubenswrapper[4749]: I0320 07:15:06.177583 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:15:06 crc kubenswrapper[4749]: E0320 07:15:06.177602 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 07:15:06 crc kubenswrapper[4749]: E0320 07:15:06.177778 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678" Mar 20 07:15:08 crc kubenswrapper[4749]: I0320 07:15:08.069931 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 07:15:08 crc kubenswrapper[4749]: I0320 07:15:08.070023 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:15:08 crc kubenswrapper[4749]: I0320 07:15:08.070047 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:15:08 crc kubenswrapper[4749]: E0320 07:15:08.070195 4749 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 07:15:08 crc kubenswrapper[4749]: E0320 07:15:08.070196 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 07:16:12.070136269 +0000 UTC m=+208.619793936 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:15:08 crc kubenswrapper[4749]: E0320 07:15:08.070247 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 07:16:12.070231421 +0000 UTC m=+208.619889158 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 07:15:08 crc kubenswrapper[4749]: E0320 07:15:08.070261 4749 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 07:15:08 crc kubenswrapper[4749]: E0320 07:15:08.070433 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 07:16:12.070396605 +0000 UTC m=+208.620054302 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 07:15:08 crc kubenswrapper[4749]: I0320 07:15:08.170531 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d19b89e-d048-4656-b5ce-c637190ab678-metrics-certs\") pod \"network-metrics-daemon-k56zh\" (UID: \"6d19b89e-d048-4656-b5ce-c637190ab678\") " pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:15:08 crc kubenswrapper[4749]: I0320 07:15:08.170847 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:15:08 crc kubenswrapper[4749]: I0320 07:15:08.170964 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:15:08 crc kubenswrapper[4749]: E0320 07:15:08.170687 4749 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 07:15:08 crc kubenswrapper[4749]: E0320 07:15:08.171172 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d19b89e-d048-4656-b5ce-c637190ab678-metrics-certs podName:6d19b89e-d048-4656-b5ce-c637190ab678 nodeName:}" failed. No retries permitted until 2026-03-20 07:16:12.171155424 +0000 UTC m=+208.720813071 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6d19b89e-d048-4656-b5ce-c637190ab678-metrics-certs") pod "network-metrics-daemon-k56zh" (UID: "6d19b89e-d048-4656-b5ce-c637190ab678") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 07:15:08 crc kubenswrapper[4749]: E0320 07:15:08.170977 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 07:15:08 crc kubenswrapper[4749]: E0320 07:15:08.171321 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 07:15:08 crc kubenswrapper[4749]: E0320 07:15:08.171351 4749 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 07:15:08 crc kubenswrapper[4749]: E0320 07:15:08.171100 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 07:15:08 crc kubenswrapper[4749]: E0320 07:15:08.171493 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 07:15:08 crc kubenswrapper[4749]: E0320 07:15:08.171526 4749 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 07:15:08 crc kubenswrapper[4749]: E0320 07:15:08.171415 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 07:16:12.171396681 +0000 UTC m=+208.721054368 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 07:15:08 crc kubenswrapper[4749]: E0320 07:15:08.171627 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 07:16:12.171590756 +0000 UTC m=+208.721248453 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 07:15:08 crc kubenswrapper[4749]: I0320 07:15:08.176675 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:15:08 crc kubenswrapper[4749]: I0320 07:15:08.176754 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:15:08 crc kubenswrapper[4749]: E0320 07:15:08.176806 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 07:15:08 crc kubenswrapper[4749]: I0320 07:15:08.176830 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:15:08 crc kubenswrapper[4749]: I0320 07:15:08.176870 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:15:08 crc kubenswrapper[4749]: E0320 07:15:08.177046 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 07:15:08 crc kubenswrapper[4749]: E0320 07:15:08.177107 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 07:15:08 crc kubenswrapper[4749]: E0320 07:15:08.177224 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678" Mar 20 07:15:09 crc kubenswrapper[4749]: E0320 07:15:09.310257 4749 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 07:15:10 crc kubenswrapper[4749]: I0320 07:15:10.176748 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:15:10 crc kubenswrapper[4749]: E0320 07:15:10.176913 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 07:15:10 crc kubenswrapper[4749]: I0320 07:15:10.177180 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:15:10 crc kubenswrapper[4749]: I0320 07:15:10.177225 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:15:10 crc kubenswrapper[4749]: E0320 07:15:10.177343 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678" Mar 20 07:15:10 crc kubenswrapper[4749]: E0320 07:15:10.177512 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 07:15:10 crc kubenswrapper[4749]: I0320 07:15:10.177550 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:15:10 crc kubenswrapper[4749]: E0320 07:15:10.177708 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 07:15:12 crc kubenswrapper[4749]: I0320 07:15:12.176376 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:15:12 crc kubenswrapper[4749]: I0320 07:15:12.176380 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:15:12 crc kubenswrapper[4749]: I0320 07:15:12.176628 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:15:12 crc kubenswrapper[4749]: E0320 07:15:12.176803 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 07:15:12 crc kubenswrapper[4749]: E0320 07:15:12.176906 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 07:15:12 crc kubenswrapper[4749]: I0320 07:15:12.176394 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:15:12 crc kubenswrapper[4749]: E0320 07:15:12.179856 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678" Mar 20 07:15:12 crc kubenswrapper[4749]: E0320 07:15:12.179999 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 07:15:14 crc kubenswrapper[4749]: I0320 07:15:14.176415 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:15:14 crc kubenswrapper[4749]: I0320 07:15:14.176484 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:15:14 crc kubenswrapper[4749]: E0320 07:15:14.176545 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 07:15:14 crc kubenswrapper[4749]: E0320 07:15:14.176840 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 07:15:14 crc kubenswrapper[4749]: I0320 07:15:14.176904 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:15:14 crc kubenswrapper[4749]: I0320 07:15:14.176932 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:15:14 crc kubenswrapper[4749]: E0320 07:15:14.177012 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678" Mar 20 07:15:14 crc kubenswrapper[4749]: E0320 07:15:14.177165 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 07:15:14 crc kubenswrapper[4749]: I0320 07:15:14.235072 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=67.235046997 podStartE2EDuration="1m7.235046997s" podCreationTimestamp="2026-03-20 07:14:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:15:14.233390053 +0000 UTC m=+150.783047780" watchObservedRunningTime="2026-03-20 07:15:14.235046997 +0000 UTC m=+150.784704684" Mar 20 07:15:14 crc kubenswrapper[4749]: I0320 07:15:14.265031 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=31.265013191 podStartE2EDuration="31.265013191s" podCreationTimestamp="2026-03-20 07:14:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:15:14.251485667 +0000 UTC m=+150.801143384" watchObservedRunningTime="2026-03-20 07:15:14.265013191 +0000 UTC m=+150.814670848" Mar 20 07:15:14 crc kubenswrapper[4749]: I0320 07:15:14.265195 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=56.265186906 podStartE2EDuration="56.265186906s" podCreationTimestamp="2026-03-20 07:14:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:15:14.264547249 +0000 UTC m=+150.814204916" watchObservedRunningTime="2026-03-20 07:15:14.265186906 +0000 UTC m=+150.814844563" Mar 20 07:15:14 crc kubenswrapper[4749]: I0320 07:15:14.310247 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-r9vtf" podStartSLOduration=94.310224816 podStartE2EDuration="1m34.310224816s" podCreationTimestamp="2026-03-20 07:13:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:15:14.309889547 +0000 UTC m=+150.859547214" watchObservedRunningTime="2026-03-20 07:15:14.310224816 +0000 UTC m=+150.859882473" Mar 20 07:15:14 crc kubenswrapper[4749]: E0320 07:15:14.311665 4749 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 07:15:14 crc kubenswrapper[4749]: I0320 07:15:14.371257 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-rcq9v" podStartSLOduration=93.371232774 podStartE2EDuration="1m33.371232774s" podCreationTimestamp="2026-03-20 07:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:15:14.331499773 +0000 UTC m=+150.881157460" watchObservedRunningTime="2026-03-20 07:15:14.371232774 +0000 UTC m=+150.920890431" Mar 20 07:15:14 crc kubenswrapper[4749]: I0320 07:15:14.416885 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-68xpr" podStartSLOduration=93.416864249 podStartE2EDuration="1m33.416864249s" podCreationTimestamp="2026-03-20 07:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:15:14.403697414 +0000 UTC m=+150.953355061" watchObservedRunningTime="2026-03-20 07:15:14.416864249 +0000 UTC m=+150.966521906" Mar 20 07:15:14 crc kubenswrapper[4749]: I0320 07:15:14.417064 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=16.417059184 podStartE2EDuration="16.417059184s" podCreationTimestamp="2026-03-20 07:14:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:15:14.416778857 +0000 UTC m=+150.966436524" watchObservedRunningTime="2026-03-20 07:15:14.417059184 +0000 UTC m=+150.966716851" Mar 20 07:15:14 crc kubenswrapper[4749]: I0320 07:15:14.465685 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=67.465662288 podStartE2EDuration="1m7.465662288s" podCreationTimestamp="2026-03-20 07:14:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:15:14.447256965 +0000 UTC m=+150.996914612" watchObservedRunningTime="2026-03-20 07:15:14.465662288 +0000 UTC m=+151.015319955" Mar 20 07:15:14 crc kubenswrapper[4749]: I0320 07:15:14.499064 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-g4qlg" podStartSLOduration=93.499048492 podStartE2EDuration="1m33.499048492s" podCreationTimestamp="2026-03-20 07:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:15:14.49819933 +0000 UTC m=+151.047856977" watchObservedRunningTime="2026-03-20 07:15:14.499048492 +0000 UTC m=+151.048706159" Mar 20 07:15:14 crc kubenswrapper[4749]: I0320 07:15:14.512680 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-fnwpn" podStartSLOduration=94.512661118 podStartE2EDuration="1m34.512661118s" podCreationTimestamp="2026-03-20 07:13:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:15:14.511035066 +0000 UTC m=+151.060692753" watchObservedRunningTime="2026-03-20 
07:15:14.512661118 +0000 UTC m=+151.062318775" Mar 20 07:15:14 crc kubenswrapper[4749]: I0320 07:15:14.565838 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podStartSLOduration=94.565816831 podStartE2EDuration="1m34.565816831s" podCreationTimestamp="2026-03-20 07:13:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:15:14.565670187 +0000 UTC m=+151.115327834" watchObservedRunningTime="2026-03-20 07:15:14.565816831 +0000 UTC m=+151.115474498" Mar 20 07:15:15 crc kubenswrapper[4749]: I0320 07:15:15.178052 4749 scope.go:117] "RemoveContainer" containerID="b7b48ca93159bd68739b8c9e9752d534320245059bc6d665239b5f34bf14bcc9" Mar 20 07:15:15 crc kubenswrapper[4749]: E0320 07:15:15.178678 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-tdgcw_openshift-ovn-kubernetes(2153d97b-a108-49f8-b6c8-8223ea65b878)\"" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" Mar 20 07:15:16 crc kubenswrapper[4749]: I0320 07:15:16.176958 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:15:16 crc kubenswrapper[4749]: I0320 07:15:16.177067 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:15:16 crc kubenswrapper[4749]: E0320 07:15:16.177125 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 07:15:16 crc kubenswrapper[4749]: E0320 07:15:16.177257 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678" Mar 20 07:15:16 crc kubenswrapper[4749]: I0320 07:15:16.177435 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:15:16 crc kubenswrapper[4749]: E0320 07:15:16.177679 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 07:15:16 crc kubenswrapper[4749]: I0320 07:15:16.178105 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:15:16 crc kubenswrapper[4749]: E0320 07:15:16.178469 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 07:15:16 crc kubenswrapper[4749]: I0320 07:15:16.228465 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 07:15:16 crc kubenswrapper[4749]: I0320 07:15:16.228501 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 07:15:16 crc kubenswrapper[4749]: I0320 07:15:16.228510 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 07:15:16 crc kubenswrapper[4749]: I0320 07:15:16.228525 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 07:15:16 crc kubenswrapper[4749]: I0320 07:15:16.228535 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T07:15:16Z","lastTransitionTime":"2026-03-20T07:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 07:15:16 crc kubenswrapper[4749]: I0320 07:15:16.283884 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-sh9k5"] Mar 20 07:15:16 crc kubenswrapper[4749]: I0320 07:15:16.284577 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sh9k5" Mar 20 07:15:16 crc kubenswrapper[4749]: I0320 07:15:16.287128 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 20 07:15:16 crc kubenswrapper[4749]: I0320 07:15:16.287677 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 20 07:15:16 crc kubenswrapper[4749]: I0320 07:15:16.288020 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 20 07:15:16 crc kubenswrapper[4749]: I0320 07:15:16.288830 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 20 07:15:16 crc kubenswrapper[4749]: I0320 07:15:16.360875 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0ef92c3f-fa57-4f68-b94c-84416f8b2636-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-sh9k5\" (UID: \"0ef92c3f-fa57-4f68-b94c-84416f8b2636\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sh9k5" Mar 20 07:15:16 crc kubenswrapper[4749]: I0320 07:15:16.360916 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0ef92c3f-fa57-4f68-b94c-84416f8b2636-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-sh9k5\" (UID: \"0ef92c3f-fa57-4f68-b94c-84416f8b2636\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sh9k5" Mar 20 07:15:16 crc kubenswrapper[4749]: I0320 07:15:16.360952 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0ef92c3f-fa57-4f68-b94c-84416f8b2636-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-sh9k5\" (UID: \"0ef92c3f-fa57-4f68-b94c-84416f8b2636\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sh9k5" Mar 20 07:15:16 crc kubenswrapper[4749]: I0320 07:15:16.361035 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ef92c3f-fa57-4f68-b94c-84416f8b2636-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-sh9k5\" (UID: \"0ef92c3f-fa57-4f68-b94c-84416f8b2636\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sh9k5" Mar 20 07:15:16 crc kubenswrapper[4749]: I0320 07:15:16.361125 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0ef92c3f-fa57-4f68-b94c-84416f8b2636-service-ca\") pod \"cluster-version-operator-5c965bbfc6-sh9k5\" (UID: \"0ef92c3f-fa57-4f68-b94c-84416f8b2636\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sh9k5" Mar 20 07:15:16 crc kubenswrapper[4749]: I0320 07:15:16.452805 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 20 07:15:16 crc kubenswrapper[4749]: I0320 07:15:16.462082 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0ef92c3f-fa57-4f68-b94c-84416f8b2636-service-ca\") pod \"cluster-version-operator-5c965bbfc6-sh9k5\" (UID: 
\"0ef92c3f-fa57-4f68-b94c-84416f8b2636\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sh9k5" Mar 20 07:15:16 crc kubenswrapper[4749]: I0320 07:15:16.462187 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0ef92c3f-fa57-4f68-b94c-84416f8b2636-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-sh9k5\" (UID: \"0ef92c3f-fa57-4f68-b94c-84416f8b2636\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sh9k5" Mar 20 07:15:16 crc kubenswrapper[4749]: I0320 07:15:16.462226 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0ef92c3f-fa57-4f68-b94c-84416f8b2636-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-sh9k5\" (UID: \"0ef92c3f-fa57-4f68-b94c-84416f8b2636\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sh9k5" Mar 20 07:15:16 crc kubenswrapper[4749]: I0320 07:15:16.462316 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0ef92c3f-fa57-4f68-b94c-84416f8b2636-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-sh9k5\" (UID: \"0ef92c3f-fa57-4f68-b94c-84416f8b2636\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sh9k5" Mar 20 07:15:16 crc kubenswrapper[4749]: I0320 07:15:16.462352 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ef92c3f-fa57-4f68-b94c-84416f8b2636-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-sh9k5\" (UID: \"0ef92c3f-fa57-4f68-b94c-84416f8b2636\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sh9k5" Mar 20 07:15:16 crc kubenswrapper[4749]: I0320 07:15:16.462474 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0ef92c3f-fa57-4f68-b94c-84416f8b2636-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-sh9k5\" (UID: \"0ef92c3f-fa57-4f68-b94c-84416f8b2636\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sh9k5" Mar 20 07:15:16 crc kubenswrapper[4749]: I0320 07:15:16.462605 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0ef92c3f-fa57-4f68-b94c-84416f8b2636-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-sh9k5\" (UID: \"0ef92c3f-fa57-4f68-b94c-84416f8b2636\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sh9k5" Mar 20 07:15:16 crc kubenswrapper[4749]: I0320 07:15:16.462995 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0ef92c3f-fa57-4f68-b94c-84416f8b2636-service-ca\") pod \"cluster-version-operator-5c965bbfc6-sh9k5\" (UID: \"0ef92c3f-fa57-4f68-b94c-84416f8b2636\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sh9k5" Mar 20 07:15:16 crc kubenswrapper[4749]: I0320 07:15:16.463415 4749 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 20 07:15:16 crc kubenswrapper[4749]: I0320 07:15:16.472947 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0ef92c3f-fa57-4f68-b94c-84416f8b2636-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-sh9k5\" (UID: \"0ef92c3f-fa57-4f68-b94c-84416f8b2636\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sh9k5" Mar 20 07:15:16 crc kubenswrapper[4749]: I0320 07:15:16.490433 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0ef92c3f-fa57-4f68-b94c-84416f8b2636-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-sh9k5\" (UID: \"0ef92c3f-fa57-4f68-b94c-84416f8b2636\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sh9k5" Mar 20 07:15:16 crc kubenswrapper[4749]: I0320 07:15:16.605593 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sh9k5" Mar 20 07:15:16 crc kubenswrapper[4749]: I0320 07:15:16.695262 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sh9k5" event={"ID":"0ef92c3f-fa57-4f68-b94c-84416f8b2636","Type":"ContainerStarted","Data":"f12a25e18cee829f8992d93eaeaf6c6d9006dceb8b402974eefc075a339cd26c"} Mar 20 07:15:17 crc kubenswrapper[4749]: I0320 07:15:17.699664 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sh9k5" event={"ID":"0ef92c3f-fa57-4f68-b94c-84416f8b2636","Type":"ContainerStarted","Data":"92a90b6118559970000495ea11bb43b0e73739b806735697a1b037c9bc7a1a7b"} Mar 20 07:15:17 crc kubenswrapper[4749]: I0320 07:15:17.713164 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sh9k5" podStartSLOduration=97.713144114 podStartE2EDuration="1m37.713144114s" podCreationTimestamp="2026-03-20 07:13:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:15:17.712508438 +0000 UTC m=+154.262166085" watchObservedRunningTime="2026-03-20 07:15:17.713144114 +0000 UTC m=+154.262801761" Mar 20 07:15:18 crc kubenswrapper[4749]: I0320 07:15:18.177255 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:15:18 crc kubenswrapper[4749]: I0320 07:15:18.177364 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:15:18 crc kubenswrapper[4749]: I0320 07:15:18.177260 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:15:18 crc kubenswrapper[4749]: E0320 07:15:18.177518 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678" Mar 20 07:15:18 crc kubenswrapper[4749]: E0320 07:15:18.177590 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 07:15:18 crc kubenswrapper[4749]: I0320 07:15:18.177597 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:15:18 crc kubenswrapper[4749]: E0320 07:15:18.177676 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 07:15:18 crc kubenswrapper[4749]: E0320 07:15:18.177818 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 07:15:19 crc kubenswrapper[4749]: E0320 07:15:19.705355 4749 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 07:15:20 crc kubenswrapper[4749]: I0320 07:15:20.176678 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:15:20 crc kubenswrapper[4749]: I0320 07:15:20.176769 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:15:20 crc kubenswrapper[4749]: E0320 07:15:20.176825 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 07:15:20 crc kubenswrapper[4749]: I0320 07:15:20.176839 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:15:20 crc kubenswrapper[4749]: I0320 07:15:20.176908 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:15:20 crc kubenswrapper[4749]: E0320 07:15:20.177021 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 07:15:20 crc kubenswrapper[4749]: E0320 07:15:20.177202 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678" Mar 20 07:15:20 crc kubenswrapper[4749]: E0320 07:15:20.177531 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 07:15:22 crc kubenswrapper[4749]: I0320 07:15:22.176791 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:15:22 crc kubenswrapper[4749]: I0320 07:15:22.176868 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:15:22 crc kubenswrapper[4749]: I0320 07:15:22.176823 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:15:22 crc kubenswrapper[4749]: I0320 07:15:22.176821 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:15:22 crc kubenswrapper[4749]: E0320 07:15:22.176981 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 07:15:22 crc kubenswrapper[4749]: E0320 07:15:22.177083 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678" Mar 20 07:15:22 crc kubenswrapper[4749]: E0320 07:15:22.177179 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 07:15:22 crc kubenswrapper[4749]: E0320 07:15:22.177303 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 07:15:24 crc kubenswrapper[4749]: I0320 07:15:24.177003 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:15:24 crc kubenswrapper[4749]: I0320 07:15:24.177166 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:15:24 crc kubenswrapper[4749]: E0320 07:15:24.179390 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678" Mar 20 07:15:24 crc kubenswrapper[4749]: I0320 07:15:24.179427 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:15:24 crc kubenswrapper[4749]: I0320 07:15:24.179563 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:15:24 crc kubenswrapper[4749]: E0320 07:15:24.179605 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 07:15:24 crc kubenswrapper[4749]: E0320 07:15:24.179748 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 07:15:24 crc kubenswrapper[4749]: E0320 07:15:24.179862 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 07:15:24 crc kubenswrapper[4749]: E0320 07:15:24.743214 4749 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 07:15:26 crc kubenswrapper[4749]: I0320 07:15:26.176991 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:15:26 crc kubenswrapper[4749]: E0320 07:15:26.177423 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 07:15:26 crc kubenswrapper[4749]: I0320 07:15:26.177104 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:15:26 crc kubenswrapper[4749]: I0320 07:15:26.176996 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:15:26 crc kubenswrapper[4749]: E0320 07:15:26.177504 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 07:15:26 crc kubenswrapper[4749]: I0320 07:15:26.177155 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:15:26 crc kubenswrapper[4749]: E0320 07:15:26.177649 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 07:15:26 crc kubenswrapper[4749]: E0320 07:15:26.177795 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678" Mar 20 07:15:27 crc kubenswrapper[4749]: I0320 07:15:27.177398 4749 scope.go:117] "RemoveContainer" containerID="b7b48ca93159bd68739b8c9e9752d534320245059bc6d665239b5f34bf14bcc9" Mar 20 07:15:27 crc kubenswrapper[4749]: E0320 07:15:27.177639 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-tdgcw_openshift-ovn-kubernetes(2153d97b-a108-49f8-b6c8-8223ea65b878)\"" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" Mar 20 07:15:28 crc kubenswrapper[4749]: I0320 07:15:28.176811 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:15:28 crc kubenswrapper[4749]: I0320 07:15:28.176953 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:15:28 crc kubenswrapper[4749]: I0320 07:15:28.177029 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:15:28 crc kubenswrapper[4749]: I0320 07:15:28.177058 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:15:28 crc kubenswrapper[4749]: E0320 07:15:28.177773 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678" Mar 20 07:15:28 crc kubenswrapper[4749]: E0320 07:15:28.177604 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 07:15:28 crc kubenswrapper[4749]: E0320 07:15:28.177706 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 07:15:28 crc kubenswrapper[4749]: E0320 07:15:28.178603 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 07:15:29 crc kubenswrapper[4749]: E0320 07:15:29.744967 4749 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 07:15:30 crc kubenswrapper[4749]: I0320 07:15:30.176813 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:15:30 crc kubenswrapper[4749]: I0320 07:15:30.176911 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:15:30 crc kubenswrapper[4749]: I0320 07:15:30.176978 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:15:30 crc kubenswrapper[4749]: I0320 07:15:30.176924 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:15:30 crc kubenswrapper[4749]: E0320 07:15:30.177089 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 07:15:30 crc kubenswrapper[4749]: E0320 07:15:30.177359 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678" Mar 20 07:15:30 crc kubenswrapper[4749]: E0320 07:15:30.177477 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 07:15:30 crc kubenswrapper[4749]: E0320 07:15:30.177593 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 07:15:32 crc kubenswrapper[4749]: I0320 07:15:32.177224 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:15:32 crc kubenswrapper[4749]: I0320 07:15:32.177566 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:15:32 crc kubenswrapper[4749]: I0320 07:15:32.177642 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:15:32 crc kubenswrapper[4749]: E0320 07:15:32.177798 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 07:15:32 crc kubenswrapper[4749]: I0320 07:15:32.177854 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:15:32 crc kubenswrapper[4749]: E0320 07:15:32.178009 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 07:15:32 crc kubenswrapper[4749]: E0320 07:15:32.178142 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 07:15:32 crc kubenswrapper[4749]: E0320 07:15:32.178262 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678" Mar 20 07:15:34 crc kubenswrapper[4749]: I0320 07:15:34.176848 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:15:34 crc kubenswrapper[4749]: I0320 07:15:34.176847 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:15:34 crc kubenswrapper[4749]: I0320 07:15:34.176901 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:15:34 crc kubenswrapper[4749]: I0320 07:15:34.176953 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:15:34 crc kubenswrapper[4749]: E0320 07:15:34.178187 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 07:15:34 crc kubenswrapper[4749]: E0320 07:15:34.178343 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678" Mar 20 07:15:34 crc kubenswrapper[4749]: E0320 07:15:34.178690 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 07:15:34 crc kubenswrapper[4749]: E0320 07:15:34.179109 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 07:15:34 crc kubenswrapper[4749]: E0320 07:15:34.746022 4749 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 07:15:36 crc kubenswrapper[4749]: I0320 07:15:36.176803 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:15:36 crc kubenswrapper[4749]: I0320 07:15:36.176955 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:15:36 crc kubenswrapper[4749]: E0320 07:15:36.177267 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 07:15:36 crc kubenswrapper[4749]: I0320 07:15:36.177391 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:15:36 crc kubenswrapper[4749]: I0320 07:15:36.177333 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:15:36 crc kubenswrapper[4749]: E0320 07:15:36.177608 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 07:15:36 crc kubenswrapper[4749]: E0320 07:15:36.177815 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 07:15:36 crc kubenswrapper[4749]: E0320 07:15:36.177981 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678" Mar 20 07:15:37 crc kubenswrapper[4749]: I0320 07:15:37.815597 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rcq9v_3f813da7-84d4-4550-ad66-f282814444a3/kube-multus/1.log" Mar 20 07:15:37 crc kubenswrapper[4749]: I0320 07:15:37.816467 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rcq9v_3f813da7-84d4-4550-ad66-f282814444a3/kube-multus/0.log" Mar 20 07:15:37 crc kubenswrapper[4749]: I0320 07:15:37.816516 4749 generic.go:334] "Generic (PLEG): container finished" podID="3f813da7-84d4-4550-ad66-f282814444a3" containerID="290c8178fc52bf0ce040051ac3f6e31f5f5245203c3a61c98c6a723710fbb94b" exitCode=1 Mar 20 07:15:37 crc kubenswrapper[4749]: I0320 07:15:37.816565 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rcq9v" event={"ID":"3f813da7-84d4-4550-ad66-f282814444a3","Type":"ContainerDied","Data":"290c8178fc52bf0ce040051ac3f6e31f5f5245203c3a61c98c6a723710fbb94b"} Mar 20 07:15:37 crc kubenswrapper[4749]: I0320 07:15:37.816601 4749 scope.go:117] "RemoveContainer" containerID="f01cb7c52842132ac657c21cb0cac4167a7b0c07ac20803552a8290a0d19e008" Mar 20 07:15:37 crc kubenswrapper[4749]: I0320 07:15:37.816948 4749 scope.go:117] "RemoveContainer" containerID="290c8178fc52bf0ce040051ac3f6e31f5f5245203c3a61c98c6a723710fbb94b" Mar 20 07:15:37 crc kubenswrapper[4749]: E0320 07:15:37.817117 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-rcq9v_openshift-multus(3f813da7-84d4-4550-ad66-f282814444a3)\"" pod="openshift-multus/multus-rcq9v" podUID="3f813da7-84d4-4550-ad66-f282814444a3" Mar 20 07:15:38 crc kubenswrapper[4749]: I0320 07:15:38.176908 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:15:38 crc kubenswrapper[4749]: I0320 07:15:38.177016 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:15:38 crc kubenswrapper[4749]: E0320 07:15:38.177078 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 07:15:38 crc kubenswrapper[4749]: E0320 07:15:38.177184 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 07:15:38 crc kubenswrapper[4749]: I0320 07:15:38.177255 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:15:38 crc kubenswrapper[4749]: E0320 07:15:38.177384 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 07:15:38 crc kubenswrapper[4749]: I0320 07:15:38.177457 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:15:38 crc kubenswrapper[4749]: E0320 07:15:38.177720 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678" Mar 20 07:15:38 crc kubenswrapper[4749]: I0320 07:15:38.179575 4749 scope.go:117] "RemoveContainer" containerID="b7b48ca93159bd68739b8c9e9752d534320245059bc6d665239b5f34bf14bcc9" Mar 20 07:15:38 crc kubenswrapper[4749]: E0320 07:15:38.180010 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-tdgcw_openshift-ovn-kubernetes(2153d97b-a108-49f8-b6c8-8223ea65b878)\"" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" Mar 20 07:15:38 crc kubenswrapper[4749]: I0320 07:15:38.821205 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rcq9v_3f813da7-84d4-4550-ad66-f282814444a3/kube-multus/1.log" Mar 20 07:15:39 crc kubenswrapper[4749]: E0320 07:15:39.747441 4749 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 07:15:40 crc kubenswrapper[4749]: I0320 07:15:40.176370 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:15:40 crc kubenswrapper[4749]: I0320 07:15:40.176494 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:15:40 crc kubenswrapper[4749]: I0320 07:15:40.176551 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:15:40 crc kubenswrapper[4749]: I0320 07:15:40.176578 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:15:40 crc kubenswrapper[4749]: E0320 07:15:40.176639 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 07:15:40 crc kubenswrapper[4749]: E0320 07:15:40.176690 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 07:15:40 crc kubenswrapper[4749]: E0320 07:15:40.176797 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678" Mar 20 07:15:40 crc kubenswrapper[4749]: E0320 07:15:40.176876 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 07:15:42 crc kubenswrapper[4749]: I0320 07:15:42.176253 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:15:42 crc kubenswrapper[4749]: I0320 07:15:42.176431 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:15:42 crc kubenswrapper[4749]: I0320 07:15:42.176468 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:15:42 crc kubenswrapper[4749]: E0320 07:15:42.176642 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 07:15:42 crc kubenswrapper[4749]: I0320 07:15:42.176769 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:15:42 crc kubenswrapper[4749]: E0320 07:15:42.176814 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 07:15:42 crc kubenswrapper[4749]: E0320 07:15:42.176967 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 07:15:42 crc kubenswrapper[4749]: E0320 07:15:42.177081 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678" Mar 20 07:15:44 crc kubenswrapper[4749]: I0320 07:15:44.176368 4749 util.go:30] "No sandbox for pod can be found. 
Mar 20 07:15:44 crc kubenswrapper[4749]: I0320 07:15:44.176368 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 07:15:44 crc kubenswrapper[4749]: I0320 07:15:44.176394 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 07:15:44 crc kubenswrapper[4749]: I0320 07:15:44.176443 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 07:15:44 crc kubenswrapper[4749]: E0320 07:15:44.177629 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 07:15:44 crc kubenswrapper[4749]: I0320 07:15:44.177852 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh"
Mar 20 07:15:44 crc kubenswrapper[4749]: E0320 07:15:44.178087 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 07:15:44 crc kubenswrapper[4749]: E0320 07:15:44.178234 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678"
Mar 20 07:15:44 crc kubenswrapper[4749]: E0320 07:15:44.178372 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 07:15:44 crc kubenswrapper[4749]: E0320 07:15:44.748783 4749 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 20 07:15:46 crc kubenswrapper[4749]: I0320 07:15:46.176450 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 07:15:46 crc kubenswrapper[4749]: E0320 07:15:46.178183 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 07:15:46 crc kubenswrapper[4749]: I0320 07:15:46.176534 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:15:46 crc kubenswrapper[4749]: E0320 07:15:46.178635 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 07:15:46 crc kubenswrapper[4749]: I0320 07:15:46.176523 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:15:46 crc kubenswrapper[4749]: I0320 07:15:46.176675 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:15:46 crc kubenswrapper[4749]: E0320 07:15:46.179367 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678" Mar 20 07:15:46 crc kubenswrapper[4749]: E0320 07:15:46.179374 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 07:15:48 crc kubenswrapper[4749]: I0320 07:15:48.176300 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:15:48 crc kubenswrapper[4749]: E0320 07:15:48.176430 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 07:15:48 crc kubenswrapper[4749]: I0320 07:15:48.176611 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:15:48 crc kubenswrapper[4749]: E0320 07:15:48.176658 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 07:15:48 crc kubenswrapper[4749]: I0320 07:15:48.176759 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:15:48 crc kubenswrapper[4749]: E0320 07:15:48.176801 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 07:15:48 crc kubenswrapper[4749]: I0320 07:15:48.176904 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:15:48 crc kubenswrapper[4749]: E0320 07:15:48.176951 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678" Mar 20 07:15:49 crc kubenswrapper[4749]: E0320 07:15:49.749984 4749 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 07:15:50 crc kubenswrapper[4749]: I0320 07:15:50.177195 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:15:50 crc kubenswrapper[4749]: I0320 07:15:50.177250 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:15:50 crc kubenswrapper[4749]: I0320 07:15:50.177253 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:15:50 crc kubenswrapper[4749]: I0320 07:15:50.177213 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:15:50 crc kubenswrapper[4749]: E0320 07:15:50.177400 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 07:15:50 crc kubenswrapper[4749]: E0320 07:15:50.177496 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 07:15:50 crc kubenswrapper[4749]: E0320 07:15:50.177580 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 07:15:50 crc kubenswrapper[4749]: E0320 07:15:50.177646 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678" Mar 20 07:15:51 crc kubenswrapper[4749]: I0320 07:15:51.177447 4749 scope.go:117] "RemoveContainer" containerID="b7b48ca93159bd68739b8c9e9752d534320245059bc6d665239b5f34bf14bcc9" Mar 20 07:15:51 crc kubenswrapper[4749]: I0320 07:15:51.870919 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tdgcw_2153d97b-a108-49f8-b6c8-8223ea65b878/ovnkube-controller/3.log" Mar 20 07:15:51 crc kubenswrapper[4749]: I0320 07:15:51.873465 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" event={"ID":"2153d97b-a108-49f8-b6c8-8223ea65b878","Type":"ContainerStarted","Data":"42d0d0f46d701b86a49a246c4107b89cf1500d319a478ae5e89dcd611706ce84"} Mar 20 07:15:51 crc kubenswrapper[4749]: I0320 07:15:51.873990 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:15:51 crc kubenswrapper[4749]: I0320 07:15:51.913830 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" podStartSLOduration=130.91380593 podStartE2EDuration="2m10.91380593s" podCreationTimestamp="2026-03-20 07:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:15:51.913437459 +0000 UTC m=+188.463095176" watchObservedRunningTime="2026-03-20 07:15:51.91380593 +0000 UTC m=+188.463463607" Mar 20 07:15:51 crc kubenswrapper[4749]: I0320 07:15:51.940446 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-k56zh"] Mar 20 07:15:51 crc kubenswrapper[4749]: I0320 07:15:51.940592 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:15:51 crc kubenswrapper[4749]: E0320 07:15:51.940725 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678" Mar 20 07:15:52 crc kubenswrapper[4749]: I0320 07:15:52.176645 4749 util.go:30] "No sandbox for pod can be found. 
Mar 20 07:15:52 crc kubenswrapper[4749]: I0320 07:15:52.176645 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 07:15:52 crc kubenswrapper[4749]: I0320 07:15:52.177043 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 07:15:52 crc kubenswrapper[4749]: I0320 07:15:52.177072 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 07:15:52 crc kubenswrapper[4749]: E0320 07:15:52.177142 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 07:15:52 crc kubenswrapper[4749]: E0320 07:15:52.177299 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 07:15:52 crc kubenswrapper[4749]: E0320 07:15:52.177372 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 07:15:53 crc kubenswrapper[4749]: I0320 07:15:53.177073 4749 scope.go:117] "RemoveContainer" containerID="290c8178fc52bf0ce040051ac3f6e31f5f5245203c3a61c98c6a723710fbb94b"
Mar 20 07:15:53 crc kubenswrapper[4749]: I0320 07:15:53.889525 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rcq9v_3f813da7-84d4-4550-ad66-f282814444a3/kube-multus/1.log"
Mar 20 07:15:53 crc kubenswrapper[4749]: I0320 07:15:53.889604 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rcq9v" event={"ID":"3f813da7-84d4-4550-ad66-f282814444a3","Type":"ContainerStarted","Data":"a472c3325b9b11a217ab5fe9ec06f916f27c82ae9b673a67e50a89cf56598aeb"}
Mar 20 07:15:54 crc kubenswrapper[4749]: I0320 07:15:54.176913 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 07:15:54 crc kubenswrapper[4749]: I0320 07:15:54.176976 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 07:15:54 crc kubenswrapper[4749]: I0320 07:15:54.176995 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:15:54 crc kubenswrapper[4749]: E0320 07:15:54.178717 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 07:15:54 crc kubenswrapper[4749]: I0320 07:15:54.178816 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:15:54 crc kubenswrapper[4749]: E0320 07:15:54.179019 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678" Mar 20 07:15:54 crc kubenswrapper[4749]: E0320 07:15:54.179113 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 07:15:54 crc kubenswrapper[4749]: E0320 07:15:54.179236 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 07:15:54 crc kubenswrapper[4749]: E0320 07:15:54.751020 4749 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 07:15:56 crc kubenswrapper[4749]: I0320 07:15:56.179223 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:15:56 crc kubenswrapper[4749]: I0320 07:15:56.179293 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:15:56 crc kubenswrapper[4749]: I0320 07:15:56.179346 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:15:56 crc kubenswrapper[4749]: E0320 07:15:56.179868 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 07:15:56 crc kubenswrapper[4749]: E0320 07:15:56.179672 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 07:15:56 crc kubenswrapper[4749]: I0320 07:15:56.179366 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:15:56 crc kubenswrapper[4749]: E0320 07:15:56.179967 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678" Mar 20 07:15:56 crc kubenswrapper[4749]: E0320 07:15:56.180231 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 07:15:58 crc kubenswrapper[4749]: I0320 07:15:58.176688 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:15:58 crc kubenswrapper[4749]: I0320 07:15:58.176754 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:15:58 crc kubenswrapper[4749]: I0320 07:15:58.176713 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:15:58 crc kubenswrapper[4749]: E0320 07:15:58.176853 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 07:15:58 crc kubenswrapper[4749]: I0320 07:15:58.176939 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:15:58 crc kubenswrapper[4749]: E0320 07:15:58.176973 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 07:15:58 crc kubenswrapper[4749]: E0320 07:15:58.177089 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 07:15:58 crc kubenswrapper[4749]: E0320 07:15:58.177244 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k56zh" podUID="6d19b89e-d048-4656-b5ce-c637190ab678" Mar 20 07:16:00 crc kubenswrapper[4749]: I0320 07:16:00.176277 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:16:00 crc kubenswrapper[4749]: I0320 07:16:00.176379 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:16:00 crc kubenswrapper[4749]: I0320 07:16:00.176449 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:16:00 crc kubenswrapper[4749]: I0320 07:16:00.176271 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:16:00 crc kubenswrapper[4749]: I0320 07:16:00.178453 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 20 07:16:00 crc kubenswrapper[4749]: I0320 07:16:00.178536 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 20 07:16:00 crc kubenswrapper[4749]: I0320 07:16:00.178620 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 20 07:16:00 crc kubenswrapper[4749]: I0320 07:16:00.178959 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 20 07:16:00 crc kubenswrapper[4749]: I0320 07:16:00.180476 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 20 07:16:00 crc kubenswrapper[4749]: I0320 07:16:00.181032 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 20 07:16:04 crc kubenswrapper[4749]: I0320 07:16:04.515223 4749 patch_prober.go:28] interesting pod/machine-config-daemon-fxqfd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 07:16:04 crc kubenswrapper[4749]: I0320 07:16:04.515343 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:16:04 crc kubenswrapper[4749]: I0320 07:16:04.691873 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.135905 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.188348 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-q5gk7"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.189101 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-q5gk7" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.190804 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvtqf"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.191137 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvtqf" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.191795 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z5lbj"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.192223 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z5lbj" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.196895 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.197019 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hvf29"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.197164 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.197400 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.197527 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.197684 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.197794 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.197905 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.198012 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.198128 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.198394 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.198521 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.201609 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-9gg28"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.201906 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-hvf29" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.202359 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gbwxf"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.202854 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-t5b5l"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.202914 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-9gg28" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.203234 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-bjx8c"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.203696 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-952x2"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.203845 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gbwxf" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.203906 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.204090 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-zznkl"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.204573 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-bjx8c" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.204622 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-952x2" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.205341 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zznkl" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.217478 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.238132 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.240184 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.240639 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.240798 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.241027 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.241326 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.241782 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.241959 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.242143 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 20 07:16:07 crc 
kubenswrapper[4749]: I0320 07:16:07.242233 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.242349 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.241995 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.242542 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.242160 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bsgpp"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.242202 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.243028 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.243153 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.243221 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.243344 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-2zlqs"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.243512 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bsgpp" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.243789 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-2zlqs" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.243389 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.243448 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.243823 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.244042 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.244133 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.244427 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-64rv4"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.245339 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-64rv4" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.245867 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.248542 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.248610 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.248886 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.248973 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.249025 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.249146 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.249227 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.249327 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.249392 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.249520 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.249588 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.249679 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.249748 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.249776 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.249814 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.249878 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.249912 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.249975 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 
07:16:07.250052 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.250175 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.250317 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.250337 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.248552 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.248585 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.250748 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.250872 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.250960 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.255333 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-jbqrm"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.256120 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jbqrm" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.257703 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.258542 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.259749 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zz6kk"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.260309 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.261078 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.261568 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.261691 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.261808 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.261845 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.261817 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.262245 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.262474 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.262556 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.262492 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.262713 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.262813 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.262943 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.263105 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.263976 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.264525 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.265786 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.266006 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.289409 4749 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bbrk6"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.290331 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-blksh"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.313585 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.314208 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.314409 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bbrk6" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.314968 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-vmnvn"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.315334 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-vmnvn" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.315522 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-blksh" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.316311 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.316536 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.316630 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lwsmc"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.317102 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lwsmc" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.317204 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.317605 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.317748 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.317794 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.318003 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.318009 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.318380 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/815dbd4c-68ea-43e3-a355-1658ccdccd22-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-z5lbj\" (UID: \"815dbd4c-68ea-43e3-a355-1658ccdccd22\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z5lbj" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.318408 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-t5b5l\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.318435 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-t5b5l\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.318455 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e222e7a0-549c-46a7-8ee6-484dd2160be4-serving-cert\") pod \"apiserver-7bbb656c7d-zznkl\" (UID: \"e222e7a0-549c-46a7-8ee6-484dd2160be4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zznkl" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.318474 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-t5b5l\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.318494 
4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d999d3d0-14e4-4759-98ab-a6d11011ca86-encryption-config\") pod \"apiserver-76f77b778f-9gg28\" (UID: \"d999d3d0-14e4-4759-98ab-a6d11011ca86\") " pod="openshift-apiserver/apiserver-76f77b778f-9gg28" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.318515 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/309f1b8f-63a4-4019-b8f0-500dc7b60c8d-serving-cert\") pod \"openshift-config-operator-7777fb866f-q5gk7\" (UID: \"309f1b8f-63a4-4019-b8f0-500dc7b60c8d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-q5gk7" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.318531 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d999d3d0-14e4-4759-98ab-a6d11011ca86-audit\") pod \"apiserver-76f77b778f-9gg28\" (UID: \"d999d3d0-14e4-4759-98ab-a6d11011ca86\") " pod="openshift-apiserver/apiserver-76f77b778f-9gg28" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.318549 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-t5b5l\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.318578 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d999d3d0-14e4-4759-98ab-a6d11011ca86-audit-dir\") pod \"apiserver-76f77b778f-9gg28\" (UID: \"d999d3d0-14e4-4759-98ab-a6d11011ca86\") " pod="openshift-apiserver/apiserver-76f77b778f-9gg28" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.318597 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/815dbd4c-68ea-43e3-a355-1658ccdccd22-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-z5lbj\" (UID: \"815dbd4c-68ea-43e3-a355-1658ccdccd22\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z5lbj" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.318614 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd1003fd-4300-423c-b500-e782a8aeb7bb-config\") pod \"controller-manager-879f6c89f-hvf29\" (UID: \"bd1003fd-4300-423c-b500-e782a8aeb7bb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hvf29" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.318628 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8sp2\" (UniqueName: \"kubernetes.io/projected/d1e24d59-bf58-421f-81a7-cc04d151fdd5-kube-api-access-b8sp2\") pod \"cluster-samples-operator-665b6dd947-gbwxf\" (UID: \"d1e24d59-bf58-421f-81a7-cc04d151fdd5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gbwxf" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.318643 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/38b3f23d-6db5-4788-bcd5-810450677cd6-audit-dir\") pod \"oauth-openshift-558db77b4-t5b5l\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.318659 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d999d3d0-14e4-4759-98ab-a6d11011ca86-image-import-ca\") pod \"apiserver-76f77b778f-9gg28\" (UID: \"d999d3d0-14e4-4759-98ab-a6d11011ca86\") " pod="openshift-apiserver/apiserver-76f77b778f-9gg28" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.318677 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e222e7a0-549c-46a7-8ee6-484dd2160be4-encryption-config\") pod \"apiserver-7bbb656c7d-zznkl\" (UID: \"e222e7a0-549c-46a7-8ee6-484dd2160be4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zznkl" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.318697 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-t5b5l\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.318719 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-t5b5l\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.318739 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d999d3d0-14e4-4759-98ab-a6d11011ca86-etcd-client\") pod \"apiserver-76f77b778f-9gg28\" (UID: \"d999d3d0-14e4-4759-98ab-a6d11011ca86\") " pod="openshift-apiserver/apiserver-76f77b778f-9gg28" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.318759 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-t5b5l\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.318777 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d999d3d0-14e4-4759-98ab-a6d11011ca86-config\") pod \"apiserver-76f77b778f-9gg28\" (UID: \"d999d3d0-14e4-4759-98ab-a6d11011ca86\") " pod="openshift-apiserver/apiserver-76f77b778f-9gg28" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.318798 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e222e7a0-549c-46a7-8ee6-484dd2160be4-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-zznkl\" (UID: \"e222e7a0-549c-46a7-8ee6-484dd2160be4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zznkl" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.318818 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a67741d8-5ced-477f-ac4c-0f7fc736f363-serving-cert\") pod \"authentication-operator-69f744f599-bjx8c\" (UID: \"a67741d8-5ced-477f-ac4c-0f7fc736f363\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bjx8c" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.318832 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d1e24d59-bf58-421f-81a7-cc04d151fdd5-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-gbwxf\" (UID: \"d1e24d59-bf58-421f-81a7-cc04d151fdd5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gbwxf" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.318846 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zbmn\" (UniqueName: \"kubernetes.io/projected/a67741d8-5ced-477f-ac4c-0f7fc736f363-kube-api-access-4zbmn\") pod \"authentication-operator-69f744f599-bjx8c\" (UID: \"a67741d8-5ced-477f-ac4c-0f7fc736f363\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bjx8c" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.318862 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd1003fd-4300-423c-b500-e782a8aeb7bb-serving-cert\") pod \"controller-manager-879f6c89f-hvf29\" (UID: \"bd1003fd-4300-423c-b500-e782a8aeb7bb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hvf29" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.318878 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f9b9110-5f21-4d71-ac4e-61e0ff6b1899-config\") pod \"route-controller-manager-6576b87f9c-jvtqf\" (UID: \"4f9b9110-5f21-4d71-ac4e-61e0ff6b1899\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvtqf" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.318894 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-t5b5l\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.318908 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d999d3d0-14e4-4759-98ab-a6d11011ca86-etcd-serving-ca\") pod \"apiserver-76f77b778f-9gg28\" (UID: \"d999d3d0-14e4-4759-98ab-a6d11011ca86\") " pod="openshift-apiserver/apiserver-76f77b778f-9gg28" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.318922 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d999d3d0-14e4-4759-98ab-a6d11011ca86-trusted-ca-bundle\") pod \"apiserver-76f77b778f-9gg28\" (UID: \"d999d3d0-14e4-4759-98ab-a6d11011ca86\") " pod="openshift-apiserver/apiserver-76f77b778f-9gg28" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.318937 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/309f1b8f-63a4-4019-b8f0-500dc7b60c8d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-q5gk7\" (UID: \"309f1b8f-63a4-4019-b8f0-500dc7b60c8d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-q5gk7" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.318952 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d999d3d0-14e4-4759-98ab-a6d11011ca86-serving-cert\") pod \"apiserver-76f77b778f-9gg28\" (UID: \"d999d3d0-14e4-4759-98ab-a6d11011ca86\") " pod="openshift-apiserver/apiserver-76f77b778f-9gg28" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.318968 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a67741d8-5ced-477f-ac4c-0f7fc736f363-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-bjx8c\" (UID: \"a67741d8-5ced-477f-ac4c-0f7fc736f363\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bjx8c" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.318985 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-t5b5l\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.319000 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gznf6\" (UniqueName: \"kubernetes.io/projected/38b3f23d-6db5-4788-bcd5-810450677cd6-kube-api-access-gznf6\") pod \"oauth-openshift-558db77b4-t5b5l\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.319015 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bd1003fd-4300-423c-b500-e782a8aeb7bb-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-hvf29\" (UID: \"bd1003fd-4300-423c-b500-e782a8aeb7bb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hvf29" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.319033 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/815dbd4c-68ea-43e3-a355-1658ccdccd22-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-z5lbj\" (UID: \"815dbd4c-68ea-43e3-a355-1658ccdccd22\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z5lbj" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.319048 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e222e7a0-549c-46a7-8ee6-484dd2160be4-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-zznkl\" (UID: \"e222e7a0-549c-46a7-8ee6-484dd2160be4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zznkl" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.319069 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-przch\" (UniqueName: \"kubernetes.io/projected/95e4555b-7f8b-4297-bed6-e0cf5e90ea3e-kube-api-access-przch\") pod \"downloads-7954f5f757-952x2\" (UID: \"95e4555b-7f8b-4297-bed6-e0cf5e90ea3e\") " pod="openshift-console/downloads-7954f5f757-952x2" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.319083 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a67741d8-5ced-477f-ac4c-0f7fc736f363-config\") pod \"authentication-operator-69f744f599-bjx8c\" (UID: \"a67741d8-5ced-477f-ac4c-0f7fc736f363\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bjx8c" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.319096 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/38b3f23d-6db5-4788-bcd5-810450677cd6-audit-policies\") pod \"oauth-openshift-558db77b4-t5b5l\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.319114 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrjz4\" (UniqueName: \"kubernetes.io/projected/bd1003fd-4300-423c-b500-e782a8aeb7bb-kube-api-access-wrjz4\") pod \"controller-manager-879f6c89f-hvf29\" (UID: \"bd1003fd-4300-423c-b500-e782a8aeb7bb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hvf29" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.319133 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7442t\" (UniqueName: \"kubernetes.io/projected/d999d3d0-14e4-4759-98ab-a6d11011ca86-kube-api-access-7442t\") pod \"apiserver-76f77b778f-9gg28\" (UID: \"d999d3d0-14e4-4759-98ab-a6d11011ca86\") " pod="openshift-apiserver/apiserver-76f77b778f-9gg28" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.319149 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p24s\" (UniqueName: \"kubernetes.io/projected/815dbd4c-68ea-43e3-a355-1658ccdccd22-kube-api-access-7p24s\") pod \"cluster-image-registry-operator-dc59b4c8b-z5lbj\" (UID: \"815dbd4c-68ea-43e3-a355-1658ccdccd22\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z5lbj" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.319152 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-j79zb"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.319164 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-742n8\" (UniqueName: \"kubernetes.io/projected/4f9b9110-5f21-4d71-ac4e-61e0ff6b1899-kube-api-access-742n8\") pod \"route-controller-manager-6576b87f9c-jvtqf\" (UID: \"4f9b9110-5f21-4d71-ac4e-61e0ff6b1899\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvtqf" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.319179 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrp6c\" (UniqueName: \"kubernetes.io/projected/309f1b8f-63a4-4019-b8f0-500dc7b60c8d-kube-api-access-mrp6c\") pod \"openshift-config-operator-7777fb866f-q5gk7\" (UID: \"309f1b8f-63a4-4019-b8f0-500dc7b60c8d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-q5gk7" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.319196 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e222e7a0-549c-46a7-8ee6-484dd2160be4-audit-policies\") pod \"apiserver-7bbb656c7d-zznkl\" (UID: \"e222e7a0-549c-46a7-8ee6-484dd2160be4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zznkl" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.319219 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f9b9110-5f21-4d71-ac4e-61e0ff6b1899-serving-cert\") pod \"route-controller-manager-6576b87f9c-jvtqf\" (UID: \"4f9b9110-5f21-4d71-ac4e-61e0ff6b1899\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvtqf" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.319232 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e222e7a0-549c-46a7-8ee6-484dd2160be4-etcd-client\") pod \"apiserver-7bbb656c7d-zznkl\" (UID: \"e222e7a0-549c-46a7-8ee6-484dd2160be4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zznkl" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.319254 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd1003fd-4300-423c-b500-e782a8aeb7bb-client-ca\") pod \"controller-manager-879f6c89f-hvf29\" (UID: \"bd1003fd-4300-423c-b500-e782a8aeb7bb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hvf29" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.319268 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d999d3d0-14e4-4759-98ab-a6d11011ca86-node-pullsecrets\") pod \"apiserver-76f77b778f-9gg28\" (UID: \"d999d3d0-14e4-4759-98ab-a6d11011ca86\") " pod="openshift-apiserver/apiserver-76f77b778f-9gg28" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.319297 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79q4p\" (UniqueName: \"kubernetes.io/projected/e222e7a0-549c-46a7-8ee6-484dd2160be4-kube-api-access-79q4p\") pod \"apiserver-7bbb656c7d-zznkl\" (UID: \"e222e7a0-549c-46a7-8ee6-484dd2160be4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zznkl" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.319313 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e222e7a0-549c-46a7-8ee6-484dd2160be4-audit-dir\") pod \"apiserver-7bbb656c7d-zznkl\" (UID: \"e222e7a0-549c-46a7-8ee6-484dd2160be4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zznkl" Mar 20 07:16:07 crc 
kubenswrapper[4749]: I0320 07:16:07.319330 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a67741d8-5ced-477f-ac4c-0f7fc736f363-service-ca-bundle\") pod \"authentication-operator-69f744f599-bjx8c\" (UID: \"a67741d8-5ced-477f-ac4c-0f7fc736f363\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bjx8c" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.319346 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-t5b5l\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.319361 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-t5b5l\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.319384 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4f9b9110-5f21-4d71-ac4e-61e0ff6b1899-client-ca\") pod \"route-controller-manager-6576b87f9c-jvtqf\" (UID: \"4f9b9110-5f21-4d71-ac4e-61e0ff6b1899\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvtqf" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.319655 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.319819 4749 util.go:30] "No sandbox for pod can be found. 
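The reconciler_common.go:245 entries above are the first half of the kubelet volume manager's two-phase reconcile: VerifyControllerAttachedVolume confirms a volume is attached before MountVolume (reconciler_common.go:218, appearing from 07:16:07.420 below) mounts it into the pod. Each entry carries the volume's UniqueName (plugin path plus pod UID) and the consuming pod, so the per-pod volume set can be recovered. A sketch under the same one-entry-per-line assumption (regex and names illustrative):

import re
from collections import defaultdict

# Phase and UniqueName from a reconciler entry; the \" sequences match the
# escaped quotes exactly as they appear in the raw log text above.
VOL_RE = re.compile(
    r'operationExecutor\.(\w+) started for volume .*?'
    r'UniqueName: \\"([^"\\]+)\\"'
)
POD_RE = re.compile(r'pod="([^"]+)"')

def volumes_by_pod(lines):
    """Group reconciler volume operations by their target pod."""
    ops = defaultdict(list)
    for line in lines:
        vol, pod = VOL_RE.search(line), POD_RE.search(line)
        if vol and pod:
            phase, unique_name = vol.groups()
            ops[pod.group(1)].append((phase, unique_name))
    return ops
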
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-j79zb" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.320040 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-nv7hv"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.326958 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.327217 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.328205 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.328550 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-dbs7t"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.329768 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gd9xh"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.330019 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-bqpst"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.328614 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nv7hv" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.328748 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.328749 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.328777 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.330248 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-dbs7t" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.328801 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.330305 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gd9xh" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.328852 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.328882 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.328916 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.330738 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c9ncz"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.330857 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-bqpst" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.331018 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-s629z"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.331181 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c9ncz" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.331374 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-lzpsv"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.331556 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s629z" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.331747 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566516-6stbk"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.332025 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566516-6stbk" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.332153 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-lzpsv" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.332264 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n2fsv"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.335093 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.336458 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-2rp65"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.336917 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-hlw9w"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.337028 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n2fsv" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.337397 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-hlw9w" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.337518 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2rp65" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.340645 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ldf7s"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.341137 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-q5gk7"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.341218 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ldf7s" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.342130 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6sv2z"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.342588 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6sv2z" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.342918 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-l285h"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.343478 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-l285h" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.345323 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566515-6hw5x"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.345757 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hvdhs"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.346147 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hvdhs" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.346370 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566515-6hw5x" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.349511 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-jfhjj"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.350044 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-jfhjj" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.350138 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5pwjp"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.350795 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5pwjp" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.351731 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-phw2k"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.364172 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.366191 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-phw2k" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.371007 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.375792 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-49gfv"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.376738 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-49gfv" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.379128 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-j79zb"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.381149 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-blksh"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.382586 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bbrk6"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.385559 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-bqpst"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.387101 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gbwxf"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.390685 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.390757 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z5lbj"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.392396 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvtqf"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.397306 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hvf29"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.398679 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-zznkl"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.402956 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-952x2"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.407424 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-9gg28"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 
07:16:07.407488 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-t5b5l"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.412747 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-vmnvn"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.412804 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-s629z"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.412816 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-jbqrm"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.414735 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.420632 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7p24s\" (UniqueName: \"kubernetes.io/projected/815dbd4c-68ea-43e3-a355-1658ccdccd22-kube-api-access-7p24s\") pod \"cluster-image-registry-operator-dc59b4c8b-z5lbj\" (UID: \"815dbd4c-68ea-43e3-a355-1658ccdccd22\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z5lbj" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.420671 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/214938de-72be-4416-84ab-c12591ef2c68-serving-cert\") pod \"console-operator-58897d9998-bqpst\" (UID: \"214938de-72be-4416-84ab-c12591ef2c68\") " pod="openshift-console-operator/console-operator-58897d9998-bqpst" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.420691 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-742n8\" (UniqueName: \"kubernetes.io/projected/4f9b9110-5f21-4d71-ac4e-61e0ff6b1899-kube-api-access-742n8\") pod \"route-controller-manager-6576b87f9c-jvtqf\" (UID: \"4f9b9110-5f21-4d71-ac4e-61e0ff6b1899\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvtqf" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.420728 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrp6c\" (UniqueName: \"kubernetes.io/projected/309f1b8f-63a4-4019-b8f0-500dc7b60c8d-kube-api-access-mrp6c\") pod \"openshift-config-operator-7777fb866f-q5gk7\" (UID: \"309f1b8f-63a4-4019-b8f0-500dc7b60c8d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-q5gk7" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.420748 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e222e7a0-549c-46a7-8ee6-484dd2160be4-audit-policies\") pod \"apiserver-7bbb656c7d-zznkl\" (UID: \"e222e7a0-549c-46a7-8ee6-484dd2160be4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zznkl" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.420775 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f9b9110-5f21-4d71-ac4e-61e0ff6b1899-serving-cert\") pod \"route-controller-manager-6576b87f9c-jvtqf\" (UID: \"4f9b9110-5f21-4d71-ac4e-61e0ff6b1899\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvtqf" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.420789 
4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e222e7a0-549c-46a7-8ee6-484dd2160be4-etcd-client\") pod \"apiserver-7bbb656c7d-zznkl\" (UID: \"e222e7a0-549c-46a7-8ee6-484dd2160be4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zznkl" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.420810 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd1003fd-4300-423c-b500-e782a8aeb7bb-client-ca\") pod \"controller-manager-879f6c89f-hvf29\" (UID: \"bd1003fd-4300-423c-b500-e782a8aeb7bb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hvf29" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.420826 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d999d3d0-14e4-4759-98ab-a6d11011ca86-node-pullsecrets\") pod \"apiserver-76f77b778f-9gg28\" (UID: \"d999d3d0-14e4-4759-98ab-a6d11011ca86\") " pod="openshift-apiserver/apiserver-76f77b778f-9gg28" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.420840 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79q4p\" (UniqueName: \"kubernetes.io/projected/e222e7a0-549c-46a7-8ee6-484dd2160be4-kube-api-access-79q4p\") pod \"apiserver-7bbb656c7d-zznkl\" (UID: \"e222e7a0-549c-46a7-8ee6-484dd2160be4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zznkl" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.420897 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2lrx\" (UniqueName: \"kubernetes.io/projected/b32a129b-0c90-4d06-87d5-fd7e70b726e5-kube-api-access-c2lrx\") pod \"machine-config-controller-84d6567774-nv7hv\" (UID: \"b32a129b-0c90-4d06-87d5-fd7e70b726e5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nv7hv" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.420913 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e2a05065-734d-4884-b037-c54ab87609eb-metrics-tls\") pod \"dns-operator-744455d44c-lzpsv\" (UID: \"e2a05065-734d-4884-b037-c54ab87609eb\") " pod="openshift-dns-operator/dns-operator-744455d44c-lzpsv" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.420931 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/da30337a-26c0-4b0b-beb5-c46c48facfc6-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-6sv2z\" (UID: \"da30337a-26c0-4b0b-beb5-c46c48facfc6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6sv2z" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.420948 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e222e7a0-549c-46a7-8ee6-484dd2160be4-audit-dir\") pod \"apiserver-7bbb656c7d-zznkl\" (UID: \"e222e7a0-549c-46a7-8ee6-484dd2160be4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zznkl" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.420965 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/954f5aa5-a05b-44bf-8642-e58746d21984-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-blksh\" (UID: \"954f5aa5-a05b-44bf-8642-e58746d21984\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-blksh" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.420986 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a67741d8-5ced-477f-ac4c-0f7fc736f363-service-ca-bundle\") pod \"authentication-operator-69f744f599-bjx8c\" (UID: \"a67741d8-5ced-477f-ac4c-0f7fc736f363\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bjx8c" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421004 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-t5b5l\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421021 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-t5b5l\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421047 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4f9b9110-5f21-4d71-ac4e-61e0ff6b1899-client-ca\") pod \"route-controller-manager-6576b87f9c-jvtqf\" (UID: \"4f9b9110-5f21-4d71-ac4e-61e0ff6b1899\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvtqf" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421063 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/815dbd4c-68ea-43e3-a355-1658ccdccd22-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-z5lbj\" (UID: \"815dbd4c-68ea-43e3-a355-1658ccdccd22\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z5lbj" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421101 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-t5b5l\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421118 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-t5b5l\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421134 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e222e7a0-549c-46a7-8ee6-484dd2160be4-serving-cert\") pod \"apiserver-7bbb656c7d-zznkl\" (UID: \"e222e7a0-549c-46a7-8ee6-484dd2160be4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zznkl" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421149 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-t5b5l\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421171 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d999d3d0-14e4-4759-98ab-a6d11011ca86-encryption-config\") pod \"apiserver-76f77b778f-9gg28\" (UID: \"d999d3d0-14e4-4759-98ab-a6d11011ca86\") " pod="openshift-apiserver/apiserver-76f77b778f-9gg28" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421191 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qs7s9\" (UniqueName: \"kubernetes.io/projected/da30337a-26c0-4b0b-beb5-c46c48facfc6-kube-api-access-qs7s9\") pod \"control-plane-machine-set-operator-78cbb6b69f-6sv2z\" (UID: \"da30337a-26c0-4b0b-beb5-c46c48facfc6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6sv2z" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421209 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/309f1b8f-63a4-4019-b8f0-500dc7b60c8d-serving-cert\") pod \"openshift-config-operator-7777fb866f-q5gk7\" (UID: \"309f1b8f-63a4-4019-b8f0-500dc7b60c8d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-q5gk7" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421225 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d999d3d0-14e4-4759-98ab-a6d11011ca86-audit\") pod \"apiserver-76f77b778f-9gg28\" (UID: \"d999d3d0-14e4-4759-98ab-a6d11011ca86\") " pod="openshift-apiserver/apiserver-76f77b778f-9gg28" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421242 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b32a129b-0c90-4d06-87d5-fd7e70b726e5-proxy-tls\") pod \"machine-config-controller-84d6567774-nv7hv\" (UID: \"b32a129b-0c90-4d06-87d5-fd7e70b726e5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nv7hv" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421261 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-t5b5l\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421276 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/954f5aa5-a05b-44bf-8642-e58746d21984-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-blksh\" (UID: 
\"954f5aa5-a05b-44bf-8642-e58746d21984\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-blksh" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421304 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s4gj\" (UniqueName: \"kubernetes.io/projected/214938de-72be-4416-84ab-c12591ef2c68-kube-api-access-9s4gj\") pod \"console-operator-58897d9998-bqpst\" (UID: \"214938de-72be-4416-84ab-c12591ef2c68\") " pod="openshift-console-operator/console-operator-58897d9998-bqpst" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421319 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p847c\" (UniqueName: \"kubernetes.io/projected/16e1bdb5-47b7-40e0-bc4b-cdd87976f461-kube-api-access-p847c\") pod \"multus-admission-controller-857f4d67dd-hlw9w\" (UID: \"16e1bdb5-47b7-40e0-bc4b-cdd87976f461\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hlw9w" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421345 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d999d3d0-14e4-4759-98ab-a6d11011ca86-audit-dir\") pod \"apiserver-76f77b778f-9gg28\" (UID: \"d999d3d0-14e4-4759-98ab-a6d11011ca86\") " pod="openshift-apiserver/apiserver-76f77b778f-9gg28" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421364 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/815dbd4c-68ea-43e3-a355-1658ccdccd22-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-z5lbj\" (UID: \"815dbd4c-68ea-43e3-a355-1658ccdccd22\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z5lbj" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421382 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd1003fd-4300-423c-b500-e782a8aeb7bb-config\") pod \"controller-manager-879f6c89f-hvf29\" (UID: \"bd1003fd-4300-423c-b500-e782a8aeb7bb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hvf29" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421399 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8sp2\" (UniqueName: \"kubernetes.io/projected/d1e24d59-bf58-421f-81a7-cc04d151fdd5-kube-api-access-b8sp2\") pod \"cluster-samples-operator-665b6dd947-gbwxf\" (UID: \"d1e24d59-bf58-421f-81a7-cc04d151fdd5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gbwxf" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421430 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/214938de-72be-4416-84ab-c12591ef2c68-config\") pod \"console-operator-58897d9998-bqpst\" (UID: \"214938de-72be-4416-84ab-c12591ef2c68\") " pod="openshift-console-operator/console-operator-58897d9998-bqpst" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421446 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/16e1bdb5-47b7-40e0-bc4b-cdd87976f461-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-hlw9w\" (UID: \"16e1bdb5-47b7-40e0-bc4b-cdd87976f461\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-hlw9w" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421487 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e222e7a0-549c-46a7-8ee6-484dd2160be4-encryption-config\") pod \"apiserver-7bbb656c7d-zznkl\" (UID: \"e222e7a0-549c-46a7-8ee6-484dd2160be4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zznkl" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421504 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/38b3f23d-6db5-4788-bcd5-810450677cd6-audit-dir\") pod \"oauth-openshift-558db77b4-t5b5l\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421518 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d999d3d0-14e4-4759-98ab-a6d11011ca86-image-import-ca\") pod \"apiserver-76f77b778f-9gg28\" (UID: \"d999d3d0-14e4-4759-98ab-a6d11011ca86\") " pod="openshift-apiserver/apiserver-76f77b778f-9gg28" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421534 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-t5b5l\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421562 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-t5b5l\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421577 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d999d3d0-14e4-4759-98ab-a6d11011ca86-etcd-client\") pod \"apiserver-76f77b778f-9gg28\" (UID: \"d999d3d0-14e4-4759-98ab-a6d11011ca86\") " pod="openshift-apiserver/apiserver-76f77b778f-9gg28" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421593 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/214938de-72be-4416-84ab-c12591ef2c68-trusted-ca\") pod \"console-operator-58897d9998-bqpst\" (UID: \"214938de-72be-4416-84ab-c12591ef2c68\") " pod="openshift-console-operator/console-operator-58897d9998-bqpst" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421609 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-t5b5l\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421636 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d999d3d0-14e4-4759-98ab-a6d11011ca86-config\") pod \"apiserver-76f77b778f-9gg28\" (UID: \"d999d3d0-14e4-4759-98ab-a6d11011ca86\") " pod="openshift-apiserver/apiserver-76f77b778f-9gg28" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421655 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/954f5aa5-a05b-44bf-8642-e58746d21984-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-blksh\" (UID: \"954f5aa5-a05b-44bf-8642-e58746d21984\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-blksh" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421669 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj2nb\" (UniqueName: \"kubernetes.io/projected/e2a05065-734d-4884-b037-c54ab87609eb-kube-api-access-qj2nb\") pod \"dns-operator-744455d44c-lzpsv\" (UID: \"e2a05065-734d-4884-b037-c54ab87609eb\") " pod="openshift-dns-operator/dns-operator-744455d44c-lzpsv" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421685 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e222e7a0-549c-46a7-8ee6-484dd2160be4-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-zznkl\" (UID: \"e222e7a0-549c-46a7-8ee6-484dd2160be4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zznkl" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421701 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a67741d8-5ced-477f-ac4c-0f7fc736f363-serving-cert\") pod \"authentication-operator-69f744f599-bjx8c\" (UID: \"a67741d8-5ced-477f-ac4c-0f7fc736f363\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bjx8c" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421716 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d1e24d59-bf58-421f-81a7-cc04d151fdd5-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-gbwxf\" (UID: \"d1e24d59-bf58-421f-81a7-cc04d151fdd5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gbwxf" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421735 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zbmn\" (UniqueName: \"kubernetes.io/projected/a67741d8-5ced-477f-ac4c-0f7fc736f363-kube-api-access-4zbmn\") pod \"authentication-operator-69f744f599-bjx8c\" (UID: \"a67741d8-5ced-477f-ac4c-0f7fc736f363\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bjx8c" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421750 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd1003fd-4300-423c-b500-e782a8aeb7bb-serving-cert\") pod \"controller-manager-879f6c89f-hvf29\" (UID: \"bd1003fd-4300-423c-b500-e782a8aeb7bb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hvf29" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421766 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-t5b5l\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421782 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f9b9110-5f21-4d71-ac4e-61e0ff6b1899-config\") pod \"route-controller-manager-6576b87f9c-jvtqf\" (UID: \"4f9b9110-5f21-4d71-ac4e-61e0ff6b1899\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvtqf" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421798 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/309f1b8f-63a4-4019-b8f0-500dc7b60c8d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-q5gk7\" (UID: \"309f1b8f-63a4-4019-b8f0-500dc7b60c8d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-q5gk7" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421814 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d999d3d0-14e4-4759-98ab-a6d11011ca86-etcd-serving-ca\") pod \"apiserver-76f77b778f-9gg28\" (UID: \"d999d3d0-14e4-4759-98ab-a6d11011ca86\") " pod="openshift-apiserver/apiserver-76f77b778f-9gg28" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421828 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d999d3d0-14e4-4759-98ab-a6d11011ca86-trusted-ca-bundle\") pod \"apiserver-76f77b778f-9gg28\" (UID: \"d999d3d0-14e4-4759-98ab-a6d11011ca86\") " pod="openshift-apiserver/apiserver-76f77b778f-9gg28" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421844 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d999d3d0-14e4-4759-98ab-a6d11011ca86-serving-cert\") pod \"apiserver-76f77b778f-9gg28\" (UID: \"d999d3d0-14e4-4759-98ab-a6d11011ca86\") " pod="openshift-apiserver/apiserver-76f77b778f-9gg28" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421859 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b32a129b-0c90-4d06-87d5-fd7e70b726e5-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-nv7hv\" (UID: \"b32a129b-0c90-4d06-87d5-fd7e70b726e5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nv7hv" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421882 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a67741d8-5ced-477f-ac4c-0f7fc736f363-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-bjx8c\" (UID: \"a67741d8-5ced-477f-ac4c-0f7fc736f363\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bjx8c" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421898 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-t5b5l\" 
(UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421913 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gznf6\" (UniqueName: \"kubernetes.io/projected/38b3f23d-6db5-4788-bcd5-810450677cd6-kube-api-access-gznf6\") pod \"oauth-openshift-558db77b4-t5b5l\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421927 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bd1003fd-4300-423c-b500-e782a8aeb7bb-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-hvf29\" (UID: \"bd1003fd-4300-423c-b500-e782a8aeb7bb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hvf29" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421943 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/815dbd4c-68ea-43e3-a355-1658ccdccd22-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-z5lbj\" (UID: \"815dbd4c-68ea-43e3-a355-1658ccdccd22\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z5lbj" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421958 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e222e7a0-549c-46a7-8ee6-484dd2160be4-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-zznkl\" (UID: \"e222e7a0-549c-46a7-8ee6-484dd2160be4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zznkl" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421974 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-przch\" (UniqueName: \"kubernetes.io/projected/95e4555b-7f8b-4297-bed6-e0cf5e90ea3e-kube-api-access-przch\") pod \"downloads-7954f5f757-952x2\" (UID: \"95e4555b-7f8b-4297-bed6-e0cf5e90ea3e\") " pod="openshift-console/downloads-7954f5f757-952x2" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.421990 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a67741d8-5ced-477f-ac4c-0f7fc736f363-config\") pod \"authentication-operator-69f744f599-bjx8c\" (UID: \"a67741d8-5ced-477f-ac4c-0f7fc736f363\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bjx8c" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.422003 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/38b3f23d-6db5-4788-bcd5-810450677cd6-audit-policies\") pod \"oauth-openshift-558db77b4-t5b5l\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.422019 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrjz4\" (UniqueName: \"kubernetes.io/projected/bd1003fd-4300-423c-b500-e782a8aeb7bb-kube-api-access-wrjz4\") pod \"controller-manager-879f6c89f-hvf29\" (UID: \"bd1003fd-4300-423c-b500-e782a8aeb7bb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hvf29" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.422038 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7442t\" (UniqueName: \"kubernetes.io/projected/d999d3d0-14e4-4759-98ab-a6d11011ca86-kube-api-access-7442t\") pod \"apiserver-76f77b778f-9gg28\" (UID: \"d999d3d0-14e4-4759-98ab-a6d11011ca86\") " pod="openshift-apiserver/apiserver-76f77b778f-9gg28" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.423663 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566516-6stbk"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.424426 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n2fsv"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.424435 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/815dbd4c-68ea-43e3-a355-1658ccdccd22-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-z5lbj\" (UID: \"815dbd4c-68ea-43e3-a355-1658ccdccd22\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z5lbj" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.423721 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4f9b9110-5f21-4d71-ac4e-61e0ff6b1899-client-ca\") pod \"route-controller-manager-6576b87f9c-jvtqf\" (UID: \"4f9b9110-5f21-4d71-ac4e-61e0ff6b1899\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvtqf" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.424970 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d999d3d0-14e4-4759-98ab-a6d11011ca86-audit-dir\") pod \"apiserver-76f77b778f-9gg28\" (UID: \"d999d3d0-14e4-4759-98ab-a6d11011ca86\") " pod="openshift-apiserver/apiserver-76f77b778f-9gg28" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.425049 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e222e7a0-549c-46a7-8ee6-484dd2160be4-audit-policies\") pod \"apiserver-7bbb656c7d-zznkl\" (UID: \"e222e7a0-549c-46a7-8ee6-484dd2160be4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zznkl" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.425715 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-t5b5l\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.425744 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d999d3d0-14e4-4759-98ab-a6d11011ca86-node-pullsecrets\") pod \"apiserver-76f77b778f-9gg28\" (UID: \"d999d3d0-14e4-4759-98ab-a6d11011ca86\") " pod="openshift-apiserver/apiserver-76f77b778f-9gg28" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.426079 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e222e7a0-549c-46a7-8ee6-484dd2160be4-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-zznkl\" (UID: \"e222e7a0-549c-46a7-8ee6-484dd2160be4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zznkl" Mar 
20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.426082 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-t5b5l\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.426794 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e222e7a0-549c-46a7-8ee6-484dd2160be4-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-zznkl\" (UID: \"e222e7a0-549c-46a7-8ee6-484dd2160be4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zznkl" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.426838 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a67741d8-5ced-477f-ac4c-0f7fc736f363-config\") pod \"authentication-operator-69f744f599-bjx8c\" (UID: \"a67741d8-5ced-477f-ac4c-0f7fc736f363\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bjx8c" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.426860 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-nv7hv"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.427780 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/38b3f23d-6db5-4788-bcd5-810450677cd6-audit-policies\") pod \"oauth-openshift-558db77b4-t5b5l\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.428196 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e222e7a0-549c-46a7-8ee6-484dd2160be4-audit-dir\") pod \"apiserver-7bbb656c7d-zznkl\" (UID: \"e222e7a0-549c-46a7-8ee6-484dd2160be4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zznkl" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.428850 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a67741d8-5ced-477f-ac4c-0f7fc736f363-service-ca-bundle\") pod \"authentication-operator-69f744f599-bjx8c\" (UID: \"a67741d8-5ced-477f-ac4c-0f7fc736f363\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bjx8c" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.429103 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/38b3f23d-6db5-4788-bcd5-810450677cd6-audit-dir\") pod \"oauth-openshift-558db77b4-t5b5l\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.431024 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d999d3d0-14e4-4759-98ab-a6d11011ca86-audit\") pod \"apiserver-76f77b778f-9gg28\" (UID: \"d999d3d0-14e4-4759-98ab-a6d11011ca86\") " pod="openshift-apiserver/apiserver-76f77b778f-9gg28" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.431410 4749 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e222e7a0-549c-46a7-8ee6-484dd2160be4-serving-cert\") pod \"apiserver-7bbb656c7d-zznkl\" (UID: \"e222e7a0-549c-46a7-8ee6-484dd2160be4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zznkl" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.431787 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.431918 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd1003fd-4300-423c-b500-e782a8aeb7bb-config\") pod \"controller-manager-879f6c89f-hvf29\" (UID: \"bd1003fd-4300-423c-b500-e782a8aeb7bb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hvf29" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.431923 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d999d3d0-14e4-4759-98ab-a6d11011ca86-config\") pod \"apiserver-76f77b778f-9gg28\" (UID: \"d999d3d0-14e4-4759-98ab-a6d11011ca86\") " pod="openshift-apiserver/apiserver-76f77b778f-9gg28" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.431982 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f9b9110-5f21-4d71-ac4e-61e0ff6b1899-serving-cert\") pod \"route-controller-manager-6576b87f9c-jvtqf\" (UID: \"4f9b9110-5f21-4d71-ac4e-61e0ff6b1899\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvtqf" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.431797 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d999d3d0-14e4-4759-98ab-a6d11011ca86-trusted-ca-bundle\") pod \"apiserver-76f77b778f-9gg28\" (UID: \"d999d3d0-14e4-4759-98ab-a6d11011ca86\") " pod="openshift-apiserver/apiserver-76f77b778f-9gg28" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.432258 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-t5b5l\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.432840 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-t5b5l\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.433036 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd1003fd-4300-423c-b500-e782a8aeb7bb-client-ca\") pod \"controller-manager-879f6c89f-hvf29\" (UID: \"bd1003fd-4300-423c-b500-e782a8aeb7bb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hvf29" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.433720 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/309f1b8f-63a4-4019-b8f0-500dc7b60c8d-available-featuregates\") 
pod \"openshift-config-operator-7777fb866f-q5gk7\" (UID: \"309f1b8f-63a4-4019-b8f0-500dc7b60c8d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-q5gk7" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.433964 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d999d3d0-14e4-4759-98ab-a6d11011ca86-etcd-serving-ca\") pod \"apiserver-76f77b778f-9gg28\" (UID: \"d999d3d0-14e4-4759-98ab-a6d11011ca86\") " pod="openshift-apiserver/apiserver-76f77b778f-9gg28" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.434198 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-l285h"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.434511 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bd1003fd-4300-423c-b500-e782a8aeb7bb-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-hvf29\" (UID: \"bd1003fd-4300-423c-b500-e782a8aeb7bb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hvf29" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.436393 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5pwjp"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.436602 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d999d3d0-14e4-4759-98ab-a6d11011ca86-image-import-ca\") pod \"apiserver-76f77b778f-9gg28\" (UID: \"d999d3d0-14e4-4759-98ab-a6d11011ca86\") " pod="openshift-apiserver/apiserver-76f77b778f-9gg28" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.436653 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a67741d8-5ced-477f-ac4c-0f7fc736f363-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-bjx8c\" (UID: \"a67741d8-5ced-477f-ac4c-0f7fc736f363\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bjx8c" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.436751 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/309f1b8f-63a4-4019-b8f0-500dc7b60c8d-serving-cert\") pod \"openshift-config-operator-7777fb866f-q5gk7\" (UID: \"309f1b8f-63a4-4019-b8f0-500dc7b60c8d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-q5gk7" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.437009 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/815dbd4c-68ea-43e3-a355-1658ccdccd22-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-z5lbj\" (UID: \"815dbd4c-68ea-43e3-a355-1658ccdccd22\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z5lbj" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.437368 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd1003fd-4300-423c-b500-e782a8aeb7bb-serving-cert\") pod \"controller-manager-879f6c89f-hvf29\" (UID: \"bd1003fd-4300-423c-b500-e782a8aeb7bb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hvf29" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.437699 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d999d3d0-14e4-4759-98ab-a6d11011ca86-encryption-config\") pod \"apiserver-76f77b778f-9gg28\" (UID: \"d999d3d0-14e4-4759-98ab-a6d11011ca86\") " pod="openshift-apiserver/apiserver-76f77b778f-9gg28" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.438181 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-bjx8c"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.438561 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d999d3d0-14e4-4759-98ab-a6d11011ca86-etcd-client\") pod \"apiserver-76f77b778f-9gg28\" (UID: \"d999d3d0-14e4-4759-98ab-a6d11011ca86\") " pod="openshift-apiserver/apiserver-76f77b778f-9gg28" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.439507 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lwsmc"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.440857 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-t8g8d"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.443175 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c9ncz"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.443257 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-t8g8d" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.444588 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-t5b5l\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.444704 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e222e7a0-549c-46a7-8ee6-484dd2160be4-etcd-client\") pod \"apiserver-7bbb656c7d-zznkl\" (UID: \"e222e7a0-549c-46a7-8ee6-484dd2160be4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zznkl" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.444740 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-t5b5l\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.445058 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d999d3d0-14e4-4759-98ab-a6d11011ca86-serving-cert\") pod \"apiserver-76f77b778f-9gg28\" (UID: \"d999d3d0-14e4-4759-98ab-a6d11011ca86\") " pod="openshift-apiserver/apiserver-76f77b778f-9gg28" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.445366 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f9b9110-5f21-4d71-ac4e-61e0ff6b1899-config\") pod 
\"route-controller-manager-6576b87f9c-jvtqf\" (UID: \"4f9b9110-5f21-4d71-ac4e-61e0ff6b1899\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvtqf" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.445638 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-lzpsv"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.445791 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e222e7a0-549c-46a7-8ee6-484dd2160be4-encryption-config\") pod \"apiserver-7bbb656c7d-zznkl\" (UID: \"e222e7a0-549c-46a7-8ee6-484dd2160be4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zznkl" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.446126 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d1e24d59-bf58-421f-81a7-cc04d151fdd5-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-gbwxf\" (UID: \"d1e24d59-bf58-421f-81a7-cc04d151fdd5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gbwxf" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.446468 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a67741d8-5ced-477f-ac4c-0f7fc736f363-serving-cert\") pod \"authentication-operator-69f744f599-bjx8c\" (UID: \"a67741d8-5ced-477f-ac4c-0f7fc736f363\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bjx8c" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.447107 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bsgpp"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.448457 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-jfhjj"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.450165 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-t5b5l\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.450211 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gd9xh"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.452373 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6sv2z"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.452669 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-t5b5l\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.452883 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-system-router-certs\") pod 
\"oauth-openshift-558db77b4-t5b5l\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.453161 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-t5b5l\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.453435 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-t5b5l\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.454169 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-hlw9w"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.454351 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.456241 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-2zlqs"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.457786 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-2rp65"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.459471 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-49gfv"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.460608 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zz6kk"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.461901 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-phw2k"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.463044 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-t8g8d"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.464067 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hvdhs"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.465140 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ldf7s"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.466459 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566515-6hw5x"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.467788 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-8q7f7"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.468786 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-7f66l"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.469018 4749 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-8q7f7" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.469249 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-7f66l" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.469892 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-8q7f7"] Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.470691 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.491332 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.511374 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.522520 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/214938de-72be-4416-84ab-c12591ef2c68-config\") pod \"console-operator-58897d9998-bqpst\" (UID: \"214938de-72be-4416-84ab-c12591ef2c68\") " pod="openshift-console-operator/console-operator-58897d9998-bqpst" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.522550 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/16e1bdb5-47b7-40e0-bc4b-cdd87976f461-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-hlw9w\" (UID: \"16e1bdb5-47b7-40e0-bc4b-cdd87976f461\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hlw9w" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.522578 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/214938de-72be-4416-84ab-c12591ef2c68-trusted-ca\") pod \"console-operator-58897d9998-bqpst\" (UID: \"214938de-72be-4416-84ab-c12591ef2c68\") " pod="openshift-console-operator/console-operator-58897d9998-bqpst" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.522595 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/954f5aa5-a05b-44bf-8642-e58746d21984-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-blksh\" (UID: \"954f5aa5-a05b-44bf-8642-e58746d21984\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-blksh" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.522614 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj2nb\" (UniqueName: \"kubernetes.io/projected/e2a05065-734d-4884-b037-c54ab87609eb-kube-api-access-qj2nb\") pod \"dns-operator-744455d44c-lzpsv\" (UID: \"e2a05065-734d-4884-b037-c54ab87609eb\") " pod="openshift-dns-operator/dns-operator-744455d44c-lzpsv" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.522643 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b32a129b-0c90-4d06-87d5-fd7e70b726e5-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-nv7hv\" (UID: 
\"b32a129b-0c90-4d06-87d5-fd7e70b726e5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nv7hv" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.522699 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/214938de-72be-4416-84ab-c12591ef2c68-serving-cert\") pod \"console-operator-58897d9998-bqpst\" (UID: \"214938de-72be-4416-84ab-c12591ef2c68\") " pod="openshift-console-operator/console-operator-58897d9998-bqpst" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.522738 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2lrx\" (UniqueName: \"kubernetes.io/projected/b32a129b-0c90-4d06-87d5-fd7e70b726e5-kube-api-access-c2lrx\") pod \"machine-config-controller-84d6567774-nv7hv\" (UID: \"b32a129b-0c90-4d06-87d5-fd7e70b726e5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nv7hv" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.522761 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/da30337a-26c0-4b0b-beb5-c46c48facfc6-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-6sv2z\" (UID: \"da30337a-26c0-4b0b-beb5-c46c48facfc6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6sv2z" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.522779 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e2a05065-734d-4884-b037-c54ab87609eb-metrics-tls\") pod \"dns-operator-744455d44c-lzpsv\" (UID: \"e2a05065-734d-4884-b037-c54ab87609eb\") " pod="openshift-dns-operator/dns-operator-744455d44c-lzpsv" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.522796 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/954f5aa5-a05b-44bf-8642-e58746d21984-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-blksh\" (UID: \"954f5aa5-a05b-44bf-8642-e58746d21984\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-blksh" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.522822 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qs7s9\" (UniqueName: \"kubernetes.io/projected/da30337a-26c0-4b0b-beb5-c46c48facfc6-kube-api-access-qs7s9\") pod \"control-plane-machine-set-operator-78cbb6b69f-6sv2z\" (UID: \"da30337a-26c0-4b0b-beb5-c46c48facfc6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6sv2z" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.522838 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b32a129b-0c90-4d06-87d5-fd7e70b726e5-proxy-tls\") pod \"machine-config-controller-84d6567774-nv7hv\" (UID: \"b32a129b-0c90-4d06-87d5-fd7e70b726e5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nv7hv" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.522862 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p847c\" (UniqueName: \"kubernetes.io/projected/16e1bdb5-47b7-40e0-bc4b-cdd87976f461-kube-api-access-p847c\") pod \"multus-admission-controller-857f4d67dd-hlw9w\" 
(UID: \"16e1bdb5-47b7-40e0-bc4b-cdd87976f461\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hlw9w" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.522877 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/954f5aa5-a05b-44bf-8642-e58746d21984-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-blksh\" (UID: \"954f5aa5-a05b-44bf-8642-e58746d21984\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-blksh" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.522891 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9s4gj\" (UniqueName: \"kubernetes.io/projected/214938de-72be-4416-84ab-c12591ef2c68-kube-api-access-9s4gj\") pod \"console-operator-58897d9998-bqpst\" (UID: \"214938de-72be-4416-84ab-c12591ef2c68\") " pod="openshift-console-operator/console-operator-58897d9998-bqpst" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.523665 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/954f5aa5-a05b-44bf-8642-e58746d21984-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-blksh\" (UID: \"954f5aa5-a05b-44bf-8642-e58746d21984\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-blksh" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.524472 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b32a129b-0c90-4d06-87d5-fd7e70b726e5-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-nv7hv\" (UID: \"b32a129b-0c90-4d06-87d5-fd7e70b726e5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nv7hv" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.526065 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/954f5aa5-a05b-44bf-8642-e58746d21984-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-blksh\" (UID: \"954f5aa5-a05b-44bf-8642-e58746d21984\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-blksh" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.530910 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.551088 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.572214 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.611046 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.631048 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.651507 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 
20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.678867 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.691417 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.711558 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.718024 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b32a129b-0c90-4d06-87d5-fd7e70b726e5-proxy-tls\") pod \"machine-config-controller-84d6567774-nv7hv\" (UID: \"b32a129b-0c90-4d06-87d5-fd7e70b726e5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nv7hv" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.731188 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.751079 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.771516 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.791049 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.813022 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.832000 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.852356 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.871779 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.892176 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.911570 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.932470 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.952639 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.971270 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.991950 4749 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console-operator"/"serving-cert" Mar 20 07:16:07 crc kubenswrapper[4749]: I0320 07:16:07.997539 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/214938de-72be-4416-84ab-c12591ef2c68-serving-cert\") pod \"console-operator-58897d9998-bqpst\" (UID: \"214938de-72be-4416-84ab-c12591ef2c68\") " pod="openshift-console-operator/console-operator-58897d9998-bqpst" Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.012838 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.032312 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.034522 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/214938de-72be-4416-84ab-c12591ef2c68-config\") pod \"console-operator-58897d9998-bqpst\" (UID: \"214938de-72be-4416-84ab-c12591ef2c68\") " pod="openshift-console-operator/console-operator-58897d9998-bqpst" Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.061981 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.065551 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/214938de-72be-4416-84ab-c12591ef2c68-trusted-ca\") pod \"console-operator-58897d9998-bqpst\" (UID: \"214938de-72be-4416-84ab-c12591ef2c68\") " pod="openshift-console-operator/console-operator-58897d9998-bqpst" Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.072441 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.092007 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.112473 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.131087 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.151568 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.174505 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.193009 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.210981 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.231594 4749 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.252694 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.271567 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.292039 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.312043 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.318964 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e2a05065-734d-4884-b037-c54ab87609eb-metrics-tls\") pod \"dns-operator-744455d44c-lzpsv\" (UID: \"e2a05065-734d-4884-b037-c54ab87609eb\") " pod="openshift-dns-operator/dns-operator-744455d44c-lzpsv" Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.331800 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.349972 4749 request.go:700] Waited for 1.012763401s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/secrets?fieldSelector=metadata.name%3Dpprof-cert&limit=500&resourceVersion=0 Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.352135 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.372088 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.391497 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.411735 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.417685 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/16e1bdb5-47b7-40e0-bc4b-cdd87976f461-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-hlw9w\" (UID: \"16e1bdb5-47b7-40e0-bc4b-cdd87976f461\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hlw9w" Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.432126 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.452923 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.472983 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.492036 4749 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.511957 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 20 07:16:08 crc kubenswrapper[4749]: E0320 07:16:08.523650 4749 secret.go:188] Couldn't get secret openshift-machine-api/control-plane-machine-set-operator-tls: failed to sync secret cache: timed out waiting for the condition Mar 20 07:16:08 crc kubenswrapper[4749]: E0320 07:16:08.523767 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da30337a-26c0-4b0b-beb5-c46c48facfc6-control-plane-machine-set-operator-tls podName:da30337a-26c0-4b0b-beb5-c46c48facfc6 nodeName:}" failed. No retries permitted until 2026-03-20 07:16:09.02374241 +0000 UTC m=+205.573400057 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "control-plane-machine-set-operator-tls" (UniqueName: "kubernetes.io/secret/da30337a-26c0-4b0b-beb5-c46c48facfc6-control-plane-machine-set-operator-tls") pod "control-plane-machine-set-operator-78cbb6b69f-6sv2z" (UID: "da30337a-26c0-4b0b-beb5-c46c48facfc6") : failed to sync secret cache: timed out waiting for the condition Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.531028 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.551931 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.571667 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.591611 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.612382 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.631421 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.651991 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.691892 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.720313 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.730877 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.751161 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.771537 4749 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.791457 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.812011 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.832807 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.851269 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.872343 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.892509 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.911881 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.931556 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.951771 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.971904 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 20 07:16:08 crc kubenswrapper[4749]: I0320 07:16:08.992373 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.012035 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.031649 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.042384 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/da30337a-26c0-4b0b-beb5-c46c48facfc6-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-6sv2z\" (UID: \"da30337a-26c0-4b0b-beb5-c46c48facfc6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6sv2z" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.048704 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/da30337a-26c0-4b0b-beb5-c46c48facfc6-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-6sv2z\" (UID: \"da30337a-26c0-4b0b-beb5-c46c48facfc6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6sv2z" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.052457 
4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.072175 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.092189 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.112277 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.159357 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7442t\" (UniqueName: \"kubernetes.io/projected/d999d3d0-14e4-4759-98ab-a6d11011ca86-kube-api-access-7442t\") pod \"apiserver-76f77b778f-9gg28\" (UID: \"d999d3d0-14e4-4759-98ab-a6d11011ca86\") " pod="openshift-apiserver/apiserver-76f77b778f-9gg28" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.164700 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-9gg28" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.189618 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p24s\" (UniqueName: \"kubernetes.io/projected/815dbd4c-68ea-43e3-a355-1658ccdccd22-kube-api-access-7p24s\") pod \"cluster-image-registry-operator-dc59b4c8b-z5lbj\" (UID: \"815dbd4c-68ea-43e3-a355-1658ccdccd22\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z5lbj" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.195686 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrp6c\" (UniqueName: \"kubernetes.io/projected/309f1b8f-63a4-4019-b8f0-500dc7b60c8d-kube-api-access-mrp6c\") pod \"openshift-config-operator-7777fb866f-q5gk7\" (UID: \"309f1b8f-63a4-4019-b8f0-500dc7b60c8d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-q5gk7" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.227032 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-742n8\" (UniqueName: \"kubernetes.io/projected/4f9b9110-5f21-4d71-ac4e-61e0ff6b1899-kube-api-access-742n8\") pod \"route-controller-manager-6576b87f9c-jvtqf\" (UID: \"4f9b9110-5f21-4d71-ac4e-61e0ff6b1899\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvtqf" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.240925 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79q4p\" (UniqueName: \"kubernetes.io/projected/e222e7a0-549c-46a7-8ee6-484dd2160be4-kube-api-access-79q4p\") pod \"apiserver-7bbb656c7d-zznkl\" (UID: \"e222e7a0-549c-46a7-8ee6-484dd2160be4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zznkl" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.248408 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-przch\" (UniqueName: \"kubernetes.io/projected/95e4555b-7f8b-4297-bed6-e0cf5e90ea3e-kube-api-access-przch\") pod \"downloads-7954f5f757-952x2\" (UID: \"95e4555b-7f8b-4297-bed6-e0cf5e90ea3e\") " pod="openshift-console/downloads-7954f5f757-952x2" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.268802 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-wrjz4\" (UniqueName: \"kubernetes.io/projected/bd1003fd-4300-423c-b500-e782a8aeb7bb-kube-api-access-wrjz4\") pod \"controller-manager-879f6c89f-hvf29\" (UID: \"bd1003fd-4300-423c-b500-e782a8aeb7bb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hvf29" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.290772 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8sp2\" (UniqueName: \"kubernetes.io/projected/d1e24d59-bf58-421f-81a7-cc04d151fdd5-kube-api-access-b8sp2\") pod \"cluster-samples-operator-665b6dd947-gbwxf\" (UID: \"d1e24d59-bf58-421f-81a7-cc04d151fdd5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gbwxf" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.319337 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gznf6\" (UniqueName: \"kubernetes.io/projected/38b3f23d-6db5-4788-bcd5-810450677cd6-kube-api-access-gznf6\") pod \"oauth-openshift-558db77b4-t5b5l\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.328982 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/815dbd4c-68ea-43e3-a355-1658ccdccd22-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-z5lbj\" (UID: \"815dbd4c-68ea-43e3-a355-1658ccdccd22\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z5lbj" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.348546 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zbmn\" (UniqueName: \"kubernetes.io/projected/a67741d8-5ced-477f-ac4c-0f7fc736f363-kube-api-access-4zbmn\") pod \"authentication-operator-69f744f599-bjx8c\" (UID: \"a67741d8-5ced-477f-ac4c-0f7fc736f363\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bjx8c" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.351831 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.358117 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-q5gk7" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.370439 4749 request.go:700] Waited for 1.926800666s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-dns/secrets?fieldSelector=metadata.name%3Ddns-dockercfg-jwfmh&limit=500&resourceVersion=0 Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.373182 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.390192 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvtqf" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.392779 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.411128 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.413388 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-9gg28"] Mar 20 07:16:09 crc kubenswrapper[4749]: W0320 07:16:09.420594 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd999d3d0_14e4_4759_98ab_a6d11011ca86.slice/crio-65a8dffdebc8cf32de98d8d29e2137e9c25c107deccc9bb957255fb8f13ceec7 WatchSource:0}: Error finding container 65a8dffdebc8cf32de98d8d29e2137e9c25c107deccc9bb957255fb8f13ceec7: Status 404 returned error can't find the container with id 65a8dffdebc8cf32de98d8d29e2137e9c25c107deccc9bb957255fb8f13ceec7 Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.429662 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z5lbj" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.431944 4749 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.443553 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-hvf29" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.451839 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.472131 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.477828 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gbwxf" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.486667 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.491770 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.493998 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-bjx8c" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.503816 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zznkl" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.512145 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.525261 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-952x2" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.546824 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj2nb\" (UniqueName: \"kubernetes.io/projected/e2a05065-734d-4884-b037-c54ab87609eb-kube-api-access-qj2nb\") pod \"dns-operator-744455d44c-lzpsv\" (UID: \"e2a05065-734d-4884-b037-c54ab87609eb\") " pod="openshift-dns-operator/dns-operator-744455d44c-lzpsv" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.554004 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-q5gk7"] Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.568682 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2lrx\" (UniqueName: \"kubernetes.io/projected/b32a129b-0c90-4d06-87d5-fd7e70b726e5-kube-api-access-c2lrx\") pod \"machine-config-controller-84d6567774-nv7hv\" (UID: \"b32a129b-0c90-4d06-87d5-fd7e70b726e5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nv7hv" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.596384 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s4gj\" (UniqueName: \"kubernetes.io/projected/214938de-72be-4416-84ab-c12591ef2c68-kube-api-access-9s4gj\") pod \"console-operator-58897d9998-bqpst\" (UID: \"214938de-72be-4416-84ab-c12591ef2c68\") " pod="openshift-console-operator/console-operator-58897d9998-bqpst" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.601512 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvtqf"] Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.608816 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/954f5aa5-a05b-44bf-8642-e58746d21984-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-blksh\" (UID: \"954f5aa5-a05b-44bf-8642-e58746d21984\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-blksh" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.623348 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-blksh" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.633692 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p847c\" (UniqueName: \"kubernetes.io/projected/16e1bdb5-47b7-40e0-bc4b-cdd87976f461-kube-api-access-p847c\") pod \"multus-admission-controller-857f4d67dd-hlw9w\" (UID: \"16e1bdb5-47b7-40e0-bc4b-cdd87976f461\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hlw9w" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.647535 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nv7hv" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.651665 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qs7s9\" (UniqueName: \"kubernetes.io/projected/da30337a-26c0-4b0b-beb5-c46c48facfc6-kube-api-access-qs7s9\") pod \"control-plane-machine-set-operator-78cbb6b69f-6sv2z\" (UID: \"da30337a-26c0-4b0b-beb5-c46c48facfc6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6sv2z" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.675887 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z5lbj"] Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.676006 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-bqpst" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.724972 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-lzpsv" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.740689 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-hlw9w" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.747804 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gbwxf"] Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.753223 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/473085e8-ee17-4244-abd0-dcf2308b4655-registry-certificates\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.753265 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7xkd\" (UniqueName: \"kubernetes.io/projected/da95cd86-f90a-4d7f-a308-4124b22d8427-kube-api-access-r7xkd\") pod \"auto-csr-approver-29566516-6stbk\" (UID: \"da95cd86-f90a-4d7f-a308-4124b22d8427\") " pod="openshift-infra/auto-csr-approver-29566516-6stbk" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.753337 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/473085e8-ee17-4244-abd0-dcf2308b4655-trusted-ca\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.753366 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/473085e8-ee17-4244-abd0-dcf2308b4655-bound-sa-token\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.753399 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/473085e8-ee17-4244-abd0-dcf2308b4655-installation-pull-secrets\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.753455 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dce27735-9fe1-49f2-a05a-c042a4b6db32-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-bbrk6\" (UID: \"dce27735-9fe1-49f2-a05a-c042a4b6db32\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bbrk6" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.753481 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/00730545-e9b7-4166-9f09-7a6fcac8cad3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-j79zb\" (UID: \"00730545-e9b7-4166-9f09-7a6fcac8cad3\") " pod="openshift-marketplace/marketplace-operator-79b997595-j79zb" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.753615 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c03762c2-c2af-4472-abb5-5017f75e738f-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-lwsmc\" (UID: \"c03762c2-c2af-4472-abb5-5017f75e738f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lwsmc" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.753646 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6df619ad-9a4e-4b8b-ba74-0d9b364bdc8d-config\") pod \"machine-approver-56656f9798-64rv4\" (UID: \"6df619ad-9a4e-4b8b-ba74-0d9b364bdc8d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-64rv4" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.754204 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xmkj\" (UniqueName: \"kubernetes.io/projected/473085e8-ee17-4244-abd0-dcf2308b4655-kube-api-access-8xmkj\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.754472 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ff4ae6b4-eebc-4a32-b390-ec7ea70c8841-console-config\") pod \"console-f9d7485db-2zlqs\" (UID: \"ff4ae6b4-eebc-4a32-b390-ec7ea70c8841\") " pod="openshift-console/console-f9d7485db-2zlqs" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.754494 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c03762c2-c2af-4472-abb5-5017f75e738f-config\") pod \"kube-controller-manager-operator-78b949d7b-lwsmc\" (UID: \"c03762c2-c2af-4472-abb5-5017f75e738f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lwsmc" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.754607 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-pkhhz\" (UniqueName: \"kubernetes.io/projected/b75a7366-ff7f-4176-80f2-687c82069d70-kube-api-access-pkhhz\") pod \"migrator-59844c95c7-s629z\" (UID: \"b75a7366-ff7f-4176-80f2-687c82069d70\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s629z" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.754633 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b1b74013-b7b2-4e6b-a227-3161015b1d80-proxy-tls\") pod \"machine-config-operator-74547568cd-2rp65\" (UID: \"b1b74013-b7b2-4e6b-a227-3161015b1d80\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2rp65" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.754742 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6799ceeb-28d4-4caf-97e4-e9115baae071-etcd-ca\") pod \"etcd-operator-b45778765-vmnvn\" (UID: \"6799ceeb-28d4-4caf-97e4-e9115baae071\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vmnvn" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.754766 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1f707ea9-be11-4354-99a3-e439dd4e6173-webhook-cert\") pod \"packageserver-d55dfcdfc-gd9xh\" (UID: \"1f707ea9-be11-4354-99a3-e439dd4e6173\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gd9xh" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.754785 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/88f1260c-ea7b-4282-b1af-a7f738cf40b9-profile-collector-cert\") pod \"catalog-operator-68c6474976-n2fsv\" (UID: \"88f1260c-ea7b-4282-b1af-a7f738cf40b9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n2fsv" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.754814 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6799ceeb-28d4-4caf-97e4-e9115baae071-etcd-service-ca\") pod \"etcd-operator-b45778765-vmnvn\" (UID: \"6799ceeb-28d4-4caf-97e4-e9115baae071\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vmnvn" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.754907 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/da62d543-787a-4364-8271-8f8f9529dd0c-default-certificate\") pod \"router-default-5444994796-dbs7t\" (UID: \"da62d543-787a-4364-8271-8f8f9529dd0c\") " pod="openshift-ingress/router-default-5444994796-dbs7t" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.755020 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/da62d543-787a-4364-8271-8f8f9529dd0c-service-ca-bundle\") pod \"router-default-5444994796-dbs7t\" (UID: \"da62d543-787a-4364-8271-8f8f9529dd0c\") " pod="openshift-ingress/router-default-5444994796-dbs7t" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.755072 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/00730545-e9b7-4166-9f09-7a6fcac8cad3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-j79zb\" (UID: \"00730545-e9b7-4166-9f09-7a6fcac8cad3\") " pod="openshift-marketplace/marketplace-operator-79b997595-j79zb" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.755138 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/657acde5-fe52-4eaa-812c-00914daf93ba-config\") pod \"kube-apiserver-operator-766d6c64bb-c9ncz\" (UID: \"657acde5-fe52-4eaa-812c-00914daf93ba\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c9ncz" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.755165 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgkxj\" (UniqueName: \"kubernetes.io/projected/6df619ad-9a4e-4b8b-ba74-0d9b364bdc8d-kube-api-access-hgkxj\") pod \"machine-approver-56656f9798-64rv4\" (UID: \"6df619ad-9a4e-4b8b-ba74-0d9b364bdc8d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-64rv4" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.755339 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff4ae6b4-eebc-4a32-b390-ec7ea70c8841-trusted-ca-bundle\") pod \"console-f9d7485db-2zlqs\" (UID: \"ff4ae6b4-eebc-4a32-b390-ec7ea70c8841\") " pod="openshift-console/console-f9d7485db-2zlqs" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.755410 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/473085e8-ee17-4244-abd0-dcf2308b4655-ca-trust-extracted\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.755464 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnwj5\" (UniqueName: \"kubernetes.io/projected/b1b74013-b7b2-4e6b-a227-3161015b1d80-kube-api-access-mnwj5\") pod \"machine-config-operator-74547568cd-2rp65\" (UID: \"b1b74013-b7b2-4e6b-a227-3161015b1d80\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2rp65" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.755489 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dce27735-9fe1-49f2-a05a-c042a4b6db32-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-bbrk6\" (UID: \"dce27735-9fe1-49f2-a05a-c042a4b6db32\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bbrk6" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.755510 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a758ef11-ae4c-4d21-96b4-0a8bded670a3-config\") pod \"openshift-apiserver-operator-796bbdcf4f-bsgpp\" (UID: \"a758ef11-ae4c-4d21-96b4-0a8bded670a3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bsgpp" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.755526 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/6799ceeb-28d4-4caf-97e4-e9115baae071-config\") pod \"etcd-operator-b45778765-vmnvn\" (UID: \"6799ceeb-28d4-4caf-97e4-e9115baae071\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vmnvn" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.755569 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da62d543-787a-4364-8271-8f8f9529dd0c-metrics-certs\") pod \"router-default-5444994796-dbs7t\" (UID: \"da62d543-787a-4364-8271-8f8f9529dd0c\") " pod="openshift-ingress/router-default-5444994796-dbs7t" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.755627 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ff4ae6b4-eebc-4a32-b390-ec7ea70c8841-console-serving-cert\") pod \"console-f9d7485db-2zlqs\" (UID: \"ff4ae6b4-eebc-4a32-b390-ec7ea70c8841\") " pod="openshift-console/console-f9d7485db-2zlqs" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.755644 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44vbg\" (UniqueName: \"kubernetes.io/projected/a758ef11-ae4c-4d21-96b4-0a8bded670a3-kube-api-access-44vbg\") pod \"openshift-apiserver-operator-796bbdcf4f-bsgpp\" (UID: \"a758ef11-ae4c-4d21-96b4-0a8bded670a3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bsgpp" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.755659 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e8fb037-3e85-4c5a-a782-857cb17429af-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-ldf7s\" (UID: \"2e8fb037-3e85-4c5a-a782-857cb17429af\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ldf7s" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.755706 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ff4ae6b4-eebc-4a32-b390-ec7ea70c8841-console-oauth-config\") pod \"console-f9d7485db-2zlqs\" (UID: \"ff4ae6b4-eebc-4a32-b390-ec7ea70c8841\") " pod="openshift-console/console-f9d7485db-2zlqs" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.755751 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnf9s\" (UniqueName: \"kubernetes.io/projected/16c3a232-5504-4648-a65b-2a0d89126e22-kube-api-access-lnf9s\") pod \"ingress-operator-5b745b69d9-jbqrm\" (UID: \"16c3a232-5504-4648-a65b-2a0d89126e22\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jbqrm" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.755772 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b1b74013-b7b2-4e6b-a227-3161015b1d80-auth-proxy-config\") pod \"machine-config-operator-74547568cd-2rp65\" (UID: \"b1b74013-b7b2-4e6b-a227-3161015b1d80\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2rp65" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.755788 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-9wk7m\" (UniqueName: \"kubernetes.io/projected/da62d543-787a-4364-8271-8f8f9529dd0c-kube-api-access-9wk7m\") pod \"router-default-5444994796-dbs7t\" (UID: \"da62d543-787a-4364-8271-8f8f9529dd0c\") " pod="openshift-ingress/router-default-5444994796-dbs7t" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.755803 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbnzl\" (UniqueName: \"kubernetes.io/projected/dce27735-9fe1-49f2-a05a-c042a4b6db32-kube-api-access-lbnzl\") pod \"openshift-controller-manager-operator-756b6f6bc6-bbrk6\" (UID: \"dce27735-9fe1-49f2-a05a-c042a4b6db32\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bbrk6" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.755822 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/657acde5-fe52-4eaa-812c-00914daf93ba-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-c9ncz\" (UID: \"657acde5-fe52-4eaa-812c-00914daf93ba\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c9ncz" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.755837 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fs7lf\" (UniqueName: \"kubernetes.io/projected/2e8fb037-3e85-4c5a-a782-857cb17429af-kube-api-access-fs7lf\") pod \"kube-storage-version-migrator-operator-b67b599dd-ldf7s\" (UID: \"2e8fb037-3e85-4c5a-a782-857cb17429af\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ldf7s" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.755855 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ff4ae6b4-eebc-4a32-b390-ec7ea70c8841-oauth-serving-cert\") pod \"console-f9d7485db-2zlqs\" (UID: \"ff4ae6b4-eebc-4a32-b390-ec7ea70c8841\") " pod="openshift-console/console-f9d7485db-2zlqs" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.755882 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkkg8\" (UniqueName: \"kubernetes.io/projected/6799ceeb-28d4-4caf-97e4-e9115baae071-kube-api-access-fkkg8\") pod \"etcd-operator-b45778765-vmnvn\" (UID: \"6799ceeb-28d4-4caf-97e4-e9115baae071\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vmnvn" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.755932 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnxph\" (UniqueName: \"kubernetes.io/projected/00730545-e9b7-4166-9f09-7a6fcac8cad3-kube-api-access-fnxph\") pod \"marketplace-operator-79b997595-j79zb\" (UID: \"00730545-e9b7-4166-9f09-7a6fcac8cad3\") " pod="openshift-marketplace/marketplace-operator-79b997595-j79zb" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.755952 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/1f707ea9-be11-4354-99a3-e439dd4e6173-tmpfs\") pod \"packageserver-d55dfcdfc-gd9xh\" (UID: \"1f707ea9-be11-4354-99a3-e439dd4e6173\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gd9xh" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.755968 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1f707ea9-be11-4354-99a3-e439dd4e6173-apiservice-cert\") pod \"packageserver-d55dfcdfc-gd9xh\" (UID: \"1f707ea9-be11-4354-99a3-e439dd4e6173\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gd9xh" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.755985 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/16c3a232-5504-4648-a65b-2a0d89126e22-trusted-ca\") pod \"ingress-operator-5b745b69d9-jbqrm\" (UID: \"16c3a232-5504-4648-a65b-2a0d89126e22\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jbqrm" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.756017 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cx7wr\" (UniqueName: \"kubernetes.io/projected/ff4ae6b4-eebc-4a32-b390-ec7ea70c8841-kube-api-access-cx7wr\") pod \"console-f9d7485db-2zlqs\" (UID: \"ff4ae6b4-eebc-4a32-b390-ec7ea70c8841\") " pod="openshift-console/console-f9d7485db-2zlqs" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.756033 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6df619ad-9a4e-4b8b-ba74-0d9b364bdc8d-auth-proxy-config\") pod \"machine-approver-56656f9798-64rv4\" (UID: \"6df619ad-9a4e-4b8b-ba74-0d9b364bdc8d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-64rv4" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.756049 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc5p9\" (UniqueName: \"kubernetes.io/projected/1f707ea9-be11-4354-99a3-e439dd4e6173-kube-api-access-hc5p9\") pod \"packageserver-d55dfcdfc-gd9xh\" (UID: \"1f707ea9-be11-4354-99a3-e439dd4e6173\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gd9xh" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.756067 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ff4ae6b4-eebc-4a32-b390-ec7ea70c8841-service-ca\") pod \"console-f9d7485db-2zlqs\" (UID: \"ff4ae6b4-eebc-4a32-b390-ec7ea70c8841\") " pod="openshift-console/console-f9d7485db-2zlqs" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.756093 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/473085e8-ee17-4244-abd0-dcf2308b4655-registry-tls\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.756134 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a758ef11-ae4c-4d21-96b4-0a8bded670a3-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-bsgpp\" (UID: \"a758ef11-ae4c-4d21-96b4-0a8bded670a3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bsgpp" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.756154 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/16c3a232-5504-4648-a65b-2a0d89126e22-metrics-tls\") pod \"ingress-operator-5b745b69d9-jbqrm\" (UID: \"16c3a232-5504-4648-a65b-2a0d89126e22\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jbqrm" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.756168 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/657acde5-fe52-4eaa-812c-00914daf93ba-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-c9ncz\" (UID: \"657acde5-fe52-4eaa-812c-00914daf93ba\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c9ncz" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.756193 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/16c3a232-5504-4648-a65b-2a0d89126e22-bound-sa-token\") pod \"ingress-operator-5b745b69d9-jbqrm\" (UID: \"16c3a232-5504-4648-a65b-2a0d89126e22\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jbqrm" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.756220 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/da62d543-787a-4364-8271-8f8f9529dd0c-stats-auth\") pod \"router-default-5444994796-dbs7t\" (UID: \"da62d543-787a-4364-8271-8f8f9529dd0c\") " pod="openshift-ingress/router-default-5444994796-dbs7t" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.756234 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6799ceeb-28d4-4caf-97e4-e9115baae071-serving-cert\") pod \"etcd-operator-b45778765-vmnvn\" (UID: \"6799ceeb-28d4-4caf-97e4-e9115baae071\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vmnvn" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.756259 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c03762c2-c2af-4472-abb5-5017f75e738f-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-lwsmc\" (UID: \"c03762c2-c2af-4472-abb5-5017f75e738f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lwsmc" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.756274 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6799ceeb-28d4-4caf-97e4-e9115baae071-etcd-client\") pod \"etcd-operator-b45778765-vmnvn\" (UID: \"6799ceeb-28d4-4caf-97e4-e9115baae071\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vmnvn" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.756329 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b1b74013-b7b2-4e6b-a227-3161015b1d80-images\") pod \"machine-config-operator-74547568cd-2rp65\" (UID: \"b1b74013-b7b2-4e6b-a227-3161015b1d80\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2rp65" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.756368 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/6df619ad-9a4e-4b8b-ba74-0d9b364bdc8d-machine-approver-tls\") pod \"machine-approver-56656f9798-64rv4\" (UID: \"6df619ad-9a4e-4b8b-ba74-0d9b364bdc8d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-64rv4" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.756389 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.756414 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/88f1260c-ea7b-4282-b1af-a7f738cf40b9-srv-cert\") pod \"catalog-operator-68c6474976-n2fsv\" (UID: \"88f1260c-ea7b-4282-b1af-a7f738cf40b9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n2fsv" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.756427 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgsdt\" (UniqueName: \"kubernetes.io/projected/88f1260c-ea7b-4282-b1af-a7f738cf40b9-kube-api-access-sgsdt\") pod \"catalog-operator-68c6474976-n2fsv\" (UID: \"88f1260c-ea7b-4282-b1af-a7f738cf40b9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n2fsv" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.756442 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e8fb037-3e85-4c5a-a782-857cb17429af-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-ldf7s\" (UID: \"2e8fb037-3e85-4c5a-a782-857cb17429af\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ldf7s" Mar 20 07:16:09 crc kubenswrapper[4749]: E0320 07:16:09.757996 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 07:16:10.257984434 +0000 UTC m=+206.807642081 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zz6kk" (UID: "473085e8-ee17-4244-abd0-dcf2308b4655") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.767550 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6sv2z" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.829277 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-t5b5l"] Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.856837 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.857052 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/00730545-e9b7-4166-9f09-7a6fcac8cad3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-j79zb\" (UID: \"00730545-e9b7-4166-9f09-7a6fcac8cad3\") " pod="openshift-marketplace/marketplace-operator-79b997595-j79zb" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.857081 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/928b2eb3-aeb6-411f-b215-a33551894e85-certs\") pod \"machine-config-server-7f66l\" (UID: \"928b2eb3-aeb6-411f-b215-a33551894e85\") " pod="openshift-machine-config-operator/machine-config-server-7f66l" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.857101 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/16b05ee8-fdde-4f11-936e-0982042ccfcf-registration-dir\") pod \"csi-hostpathplugin-8q7f7\" (UID: \"16b05ee8-fdde-4f11-936e-0982042ccfcf\") " pod="hostpath-provisioner/csi-hostpathplugin-8q7f7" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.857120 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/657acde5-fe52-4eaa-812c-00914daf93ba-config\") pod \"kube-apiserver-operator-766d6c64bb-c9ncz\" (UID: \"657acde5-fe52-4eaa-812c-00914daf93ba\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c9ncz" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.857136 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgkxj\" (UniqueName: \"kubernetes.io/projected/6df619ad-9a4e-4b8b-ba74-0d9b364bdc8d-kube-api-access-hgkxj\") pod \"machine-approver-56656f9798-64rv4\" (UID: \"6df619ad-9a4e-4b8b-ba74-0d9b364bdc8d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-64rv4" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.857151 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff4ae6b4-eebc-4a32-b390-ec7ea70c8841-trusted-ca-bundle\") pod \"console-f9d7485db-2zlqs\" (UID: \"ff4ae6b4-eebc-4a32-b390-ec7ea70c8841\") " pod="openshift-console/console-f9d7485db-2zlqs" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.857182 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdg2k\" (UniqueName: \"kubernetes.io/projected/928b2eb3-aeb6-411f-b215-a33551894e85-kube-api-access-jdg2k\") pod \"machine-config-server-7f66l\" (UID: 
\"928b2eb3-aeb6-411f-b215-a33551894e85\") " pod="openshift-machine-config-operator/machine-config-server-7f66l" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.857198 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/473085e8-ee17-4244-abd0-dcf2308b4655-ca-trust-extracted\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.857215 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnwj5\" (UniqueName: \"kubernetes.io/projected/b1b74013-b7b2-4e6b-a227-3161015b1d80-kube-api-access-mnwj5\") pod \"machine-config-operator-74547568cd-2rp65\" (UID: \"b1b74013-b7b2-4e6b-a227-3161015b1d80\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2rp65" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.857230 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dce27735-9fe1-49f2-a05a-c042a4b6db32-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-bbrk6\" (UID: \"dce27735-9fe1-49f2-a05a-c042a4b6db32\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bbrk6" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.857246 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a758ef11-ae4c-4d21-96b4-0a8bded670a3-config\") pod \"openshift-apiserver-operator-796bbdcf4f-bsgpp\" (UID: \"a758ef11-ae4c-4d21-96b4-0a8bded670a3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bsgpp" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.857325 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6799ceeb-28d4-4caf-97e4-e9115baae071-config\") pod \"etcd-operator-b45778765-vmnvn\" (UID: \"6799ceeb-28d4-4caf-97e4-e9115baae071\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vmnvn" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.857342 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da62d543-787a-4364-8271-8f8f9529dd0c-metrics-certs\") pod \"router-default-5444994796-dbs7t\" (UID: \"da62d543-787a-4364-8271-8f8f9529dd0c\") " pod="openshift-ingress/router-default-5444994796-dbs7t" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.857358 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ecc1f279-eced-4b51-8ded-b7d00d089722-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-phw2k\" (UID: \"ecc1f279-eced-4b51-8ded-b7d00d089722\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-phw2k" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.857373 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ecc1f279-eced-4b51-8ded-b7d00d089722-images\") pod \"machine-api-operator-5694c8668f-phw2k\" (UID: \"ecc1f279-eced-4b51-8ded-b7d00d089722\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-phw2k" Mar 20 07:16:09 crc 
kubenswrapper[4749]: I0320 07:16:09.857389 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpgj5\" (UniqueName: \"kubernetes.io/projected/eac86025-7f7e-49a8-ac1b-4bf8c1a65c35-kube-api-access-tpgj5\") pod \"service-ca-9c57cc56f-jfhjj\" (UID: \"eac86025-7f7e-49a8-ac1b-4bf8c1a65c35\") " pod="openshift-service-ca/service-ca-9c57cc56f-jfhjj" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.857452 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ff4ae6b4-eebc-4a32-b390-ec7ea70c8841-console-serving-cert\") pod \"console-f9d7485db-2zlqs\" (UID: \"ff4ae6b4-eebc-4a32-b390-ec7ea70c8841\") " pod="openshift-console/console-f9d7485db-2zlqs" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.857469 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44vbg\" (UniqueName: \"kubernetes.io/projected/a758ef11-ae4c-4d21-96b4-0a8bded670a3-kube-api-access-44vbg\") pod \"openshift-apiserver-operator-796bbdcf4f-bsgpp\" (UID: \"a758ef11-ae4c-4d21-96b4-0a8bded670a3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bsgpp" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.857485 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e8fb037-3e85-4c5a-a782-857cb17429af-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-ldf7s\" (UID: \"2e8fb037-3e85-4c5a-a782-857cb17429af\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ldf7s" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.857516 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ff4ae6b4-eebc-4a32-b390-ec7ea70c8841-console-oauth-config\") pod \"console-f9d7485db-2zlqs\" (UID: \"ff4ae6b4-eebc-4a32-b390-ec7ea70c8841\") " pod="openshift-console/console-f9d7485db-2zlqs" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.857531 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnf9s\" (UniqueName: \"kubernetes.io/projected/16c3a232-5504-4648-a65b-2a0d89126e22-kube-api-access-lnf9s\") pod \"ingress-operator-5b745b69d9-jbqrm\" (UID: \"16c3a232-5504-4648-a65b-2a0d89126e22\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jbqrm" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.857550 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b1b74013-b7b2-4e6b-a227-3161015b1d80-auth-proxy-config\") pod \"machine-config-operator-74547568cd-2rp65\" (UID: \"b1b74013-b7b2-4e6b-a227-3161015b1d80\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2rp65" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.857576 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wk7m\" (UniqueName: \"kubernetes.io/projected/da62d543-787a-4364-8271-8f8f9529dd0c-kube-api-access-9wk7m\") pod \"router-default-5444994796-dbs7t\" (UID: \"da62d543-787a-4364-8271-8f8f9529dd0c\") " pod="openshift-ingress/router-default-5444994796-dbs7t" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.857592 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-lbnzl\" (UniqueName: \"kubernetes.io/projected/dce27735-9fe1-49f2-a05a-c042a4b6db32-kube-api-access-lbnzl\") pod \"openshift-controller-manager-operator-756b6f6bc6-bbrk6\" (UID: \"dce27735-9fe1-49f2-a05a-c042a4b6db32\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bbrk6" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.857628 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/657acde5-fe52-4eaa-812c-00914daf93ba-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-c9ncz\" (UID: \"657acde5-fe52-4eaa-812c-00914daf93ba\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c9ncz" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.857645 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fs7lf\" (UniqueName: \"kubernetes.io/projected/2e8fb037-3e85-4c5a-a782-857cb17429af-kube-api-access-fs7lf\") pod \"kube-storage-version-migrator-operator-b67b599dd-ldf7s\" (UID: \"2e8fb037-3e85-4c5a-a782-857cb17429af\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ldf7s" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.857669 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ff4ae6b4-eebc-4a32-b390-ec7ea70c8841-oauth-serving-cert\") pod \"console-f9d7485db-2zlqs\" (UID: \"ff4ae6b4-eebc-4a32-b390-ec7ea70c8841\") " pod="openshift-console/console-f9d7485db-2zlqs" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.857692 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkkg8\" (UniqueName: \"kubernetes.io/projected/6799ceeb-28d4-4caf-97e4-e9115baae071-kube-api-access-fkkg8\") pod \"etcd-operator-b45778765-vmnvn\" (UID: \"6799ceeb-28d4-4caf-97e4-e9115baae071\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vmnvn" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.857709 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/928b2eb3-aeb6-411f-b215-a33551894e85-node-bootstrap-token\") pod \"machine-config-server-7f66l\" (UID: \"928b2eb3-aeb6-411f-b215-a33551894e85\") " pod="openshift-machine-config-operator/machine-config-server-7f66l" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.857726 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnxph\" (UniqueName: \"kubernetes.io/projected/00730545-e9b7-4166-9f09-7a6fcac8cad3-kube-api-access-fnxph\") pod \"marketplace-operator-79b997595-j79zb\" (UID: \"00730545-e9b7-4166-9f09-7a6fcac8cad3\") " pod="openshift-marketplace/marketplace-operator-79b997595-j79zb" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.857742 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1f707ea9-be11-4354-99a3-e439dd4e6173-apiservice-cert\") pod \"packageserver-d55dfcdfc-gd9xh\" (UID: \"1f707ea9-be11-4354-99a3-e439dd4e6173\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gd9xh" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.857758 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/eac86025-7f7e-49a8-ac1b-4bf8c1a65c35-signing-cabundle\") pod \"service-ca-9c57cc56f-jfhjj\" (UID: \"eac86025-7f7e-49a8-ac1b-4bf8c1a65c35\") " pod="openshift-service-ca/service-ca-9c57cc56f-jfhjj" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.857791 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/1f707ea9-be11-4354-99a3-e439dd4e6173-tmpfs\") pod \"packageserver-d55dfcdfc-gd9xh\" (UID: \"1f707ea9-be11-4354-99a3-e439dd4e6173\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gd9xh" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.857815 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/16c3a232-5504-4648-a65b-2a0d89126e22-trusted-ca\") pod \"ingress-operator-5b745b69d9-jbqrm\" (UID: \"16c3a232-5504-4648-a65b-2a0d89126e22\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jbqrm" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.857831 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6b71bdcd-f324-489c-a3ae-61ac7648b36a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-hvdhs\" (UID: \"6b71bdcd-f324-489c-a3ae-61ac7648b36a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hvdhs" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.857847 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6znw7\" (UniqueName: \"kubernetes.io/projected/d842a173-2088-46ee-bdd3-4f058a2c62e8-kube-api-access-6znw7\") pod \"ingress-canary-49gfv\" (UID: \"d842a173-2088-46ee-bdd3-4f058a2c62e8\") " pod="openshift-ingress-canary/ingress-canary-49gfv" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.857863 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d441824b-dc11-4f89-af28-0a5c76439296-metrics-tls\") pod \"dns-default-t8g8d\" (UID: \"d441824b-dc11-4f89-af28-0a5c76439296\") " pod="openshift-dns/dns-default-t8g8d" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.857908 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cx7wr\" (UniqueName: \"kubernetes.io/projected/ff4ae6b4-eebc-4a32-b390-ec7ea70c8841-kube-api-access-cx7wr\") pod \"console-f9d7485db-2zlqs\" (UID: \"ff4ae6b4-eebc-4a32-b390-ec7ea70c8841\") " pod="openshift-console/console-f9d7485db-2zlqs" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.857923 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6df619ad-9a4e-4b8b-ba74-0d9b364bdc8d-auth-proxy-config\") pod \"machine-approver-56656f9798-64rv4\" (UID: \"6df619ad-9a4e-4b8b-ba74-0d9b364bdc8d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-64rv4" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.857939 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc5p9\" (UniqueName: \"kubernetes.io/projected/1f707ea9-be11-4354-99a3-e439dd4e6173-kube-api-access-hc5p9\") pod \"packageserver-d55dfcdfc-gd9xh\" (UID: \"1f707ea9-be11-4354-99a3-e439dd4e6173\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gd9xh" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.857955 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9qln\" (UniqueName: \"kubernetes.io/projected/16b05ee8-fdde-4f11-936e-0982042ccfcf-kube-api-access-f9qln\") pod \"csi-hostpathplugin-8q7f7\" (UID: \"16b05ee8-fdde-4f11-936e-0982042ccfcf\") " pod="hostpath-provisioner/csi-hostpathplugin-8q7f7" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.857972 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2af8695f-a945-411d-ac95-03191fb3080d-config-volume\") pod \"collect-profiles-29566515-6hw5x\" (UID: \"2af8695f-a945-411d-ac95-03191fb3080d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566515-6hw5x" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.857999 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ff4ae6b4-eebc-4a32-b390-ec7ea70c8841-service-ca\") pod \"console-f9d7485db-2zlqs\" (UID: \"ff4ae6b4-eebc-4a32-b390-ec7ea70c8841\") " pod="openshift-console/console-f9d7485db-2zlqs" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.858025 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/473085e8-ee17-4244-abd0-dcf2308b4655-registry-tls\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.858066 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a758ef11-ae4c-4d21-96b4-0a8bded670a3-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-bsgpp\" (UID: \"a758ef11-ae4c-4d21-96b4-0a8bded670a3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bsgpp" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.858082 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/657acde5-fe52-4eaa-812c-00914daf93ba-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-c9ncz\" (UID: \"657acde5-fe52-4eaa-812c-00914daf93ba\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c9ncz" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.858097 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/16c3a232-5504-4648-a65b-2a0d89126e22-metrics-tls\") pod \"ingress-operator-5b745b69d9-jbqrm\" (UID: \"16c3a232-5504-4648-a65b-2a0d89126e22\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jbqrm" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.858122 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/16c3a232-5504-4648-a65b-2a0d89126e22-bound-sa-token\") pod \"ingress-operator-5b745b69d9-jbqrm\" (UID: \"16c3a232-5504-4648-a65b-2a0d89126e22\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jbqrm" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.858145 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"stats-auth\" (UniqueName: \"kubernetes.io/secret/da62d543-787a-4364-8271-8f8f9529dd0c-stats-auth\") pod \"router-default-5444994796-dbs7t\" (UID: \"da62d543-787a-4364-8271-8f8f9529dd0c\") " pod="openshift-ingress/router-default-5444994796-dbs7t" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.858159 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6799ceeb-28d4-4caf-97e4-e9115baae071-serving-cert\") pod \"etcd-operator-b45778765-vmnvn\" (UID: \"6799ceeb-28d4-4caf-97e4-e9115baae071\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vmnvn" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.858176 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0121f83c-494b-40f1-9a70-65344ed716ad-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-5pwjp\" (UID: \"0121f83c-494b-40f1-9a70-65344ed716ad\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5pwjp" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.858204 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c03762c2-c2af-4472-abb5-5017f75e738f-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-lwsmc\" (UID: \"c03762c2-c2af-4472-abb5-5017f75e738f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lwsmc" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.858219 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6799ceeb-28d4-4caf-97e4-e9115baae071-etcd-client\") pod \"etcd-operator-b45778765-vmnvn\" (UID: \"6799ceeb-28d4-4caf-97e4-e9115baae071\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vmnvn" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.858243 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b1b74013-b7b2-4e6b-a227-3161015b1d80-images\") pod \"machine-config-operator-74547568cd-2rp65\" (UID: \"b1b74013-b7b2-4e6b-a227-3161015b1d80\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2rp65" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.858265 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/16b05ee8-fdde-4f11-936e-0982042ccfcf-csi-data-dir\") pod \"csi-hostpathplugin-8q7f7\" (UID: \"16b05ee8-fdde-4f11-936e-0982042ccfcf\") " pod="hostpath-provisioner/csi-hostpathplugin-8q7f7" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.858355 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/6df619ad-9a4e-4b8b-ba74-0d9b364bdc8d-machine-approver-tls\") pod \"machine-approver-56656f9798-64rv4\" (UID: \"6df619ad-9a4e-4b8b-ba74-0d9b364bdc8d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-64rv4" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.858377 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/88f1260c-ea7b-4282-b1af-a7f738cf40b9-srv-cert\") pod \"catalog-operator-68c6474976-n2fsv\" (UID: 
\"88f1260c-ea7b-4282-b1af-a7f738cf40b9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n2fsv" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.858394 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgsdt\" (UniqueName: \"kubernetes.io/projected/88f1260c-ea7b-4282-b1af-a7f738cf40b9-kube-api-access-sgsdt\") pod \"catalog-operator-68c6474976-n2fsv\" (UID: \"88f1260c-ea7b-4282-b1af-a7f738cf40b9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n2fsv" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.858410 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e8fb037-3e85-4c5a-a782-857cb17429af-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-ldf7s\" (UID: \"2e8fb037-3e85-4c5a-a782-857cb17429af\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ldf7s" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.858450 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/473085e8-ee17-4244-abd0-dcf2308b4655-registry-certificates\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.858486 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7xkd\" (UniqueName: \"kubernetes.io/projected/da95cd86-f90a-4d7f-a308-4124b22d8427-kube-api-access-r7xkd\") pod \"auto-csr-approver-29566516-6stbk\" (UID: \"da95cd86-f90a-4d7f-a308-4124b22d8427\") " pod="openshift-infra/auto-csr-approver-29566516-6stbk" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.858509 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/473085e8-ee17-4244-abd0-dcf2308b4655-trusted-ca\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.858523 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/473085e8-ee17-4244-abd0-dcf2308b4655-bound-sa-token\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.858548 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/473085e8-ee17-4244-abd0-dcf2308b4655-installation-pull-secrets\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.858564 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stmzg\" (UniqueName: \"kubernetes.io/projected/d441824b-dc11-4f89-af28-0a5c76439296-kube-api-access-stmzg\") pod \"dns-default-t8g8d\" (UID: \"d441824b-dc11-4f89-af28-0a5c76439296\") " pod="openshift-dns/dns-default-t8g8d" Mar 20 07:16:09 crc 
kubenswrapper[4749]: I0320 07:16:09.858579 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/eac86025-7f7e-49a8-ac1b-4bf8c1a65c35-signing-key\") pod \"service-ca-9c57cc56f-jfhjj\" (UID: \"eac86025-7f7e-49a8-ac1b-4bf8c1a65c35\") " pod="openshift-service-ca/service-ca-9c57cc56f-jfhjj" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.858594 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxnkp\" (UniqueName: \"kubernetes.io/projected/ecc1f279-eced-4b51-8ded-b7d00d089722-kube-api-access-vxnkp\") pod \"machine-api-operator-5694c8668f-phw2k\" (UID: \"ecc1f279-eced-4b51-8ded-b7d00d089722\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-phw2k" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.858640 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dce27735-9fe1-49f2-a05a-c042a4b6db32-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-bbrk6\" (UID: \"dce27735-9fe1-49f2-a05a-c042a4b6db32\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bbrk6" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.858656 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/16b05ee8-fdde-4f11-936e-0982042ccfcf-mountpoint-dir\") pod \"csi-hostpathplugin-8q7f7\" (UID: \"16b05ee8-fdde-4f11-936e-0982042ccfcf\") " pod="hostpath-provisioner/csi-hostpathplugin-8q7f7" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.858672 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/00730545-e9b7-4166-9f09-7a6fcac8cad3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-j79zb\" (UID: \"00730545-e9b7-4166-9f09-7a6fcac8cad3\") " pod="openshift-marketplace/marketplace-operator-79b997595-j79zb" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.858689 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v89xs\" (UniqueName: \"kubernetes.io/projected/0121f83c-494b-40f1-9a70-65344ed716ad-kube-api-access-v89xs\") pod \"package-server-manager-789f6589d5-5pwjp\" (UID: \"0121f83c-494b-40f1-9a70-65344ed716ad\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5pwjp" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.858706 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d441824b-dc11-4f89-af28-0a5c76439296-config-volume\") pod \"dns-default-t8g8d\" (UID: \"d441824b-dc11-4f89-af28-0a5c76439296\") " pod="openshift-dns/dns-default-t8g8d" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.858723 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6df619ad-9a4e-4b8b-ba74-0d9b364bdc8d-config\") pod \"machine-approver-56656f9798-64rv4\" (UID: \"6df619ad-9a4e-4b8b-ba74-0d9b364bdc8d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-64rv4" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.858740 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c03762c2-c2af-4472-abb5-5017f75e738f-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-lwsmc\" (UID: \"c03762c2-c2af-4472-abb5-5017f75e738f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lwsmc" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.858757 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/16b05ee8-fdde-4f11-936e-0982042ccfcf-plugins-dir\") pod \"csi-hostpathplugin-8q7f7\" (UID: \"16b05ee8-fdde-4f11-936e-0982042ccfcf\") " pod="hostpath-provisioner/csi-hostpathplugin-8q7f7" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.858792 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d842a173-2088-46ee-bdd3-4f058a2c62e8-cert\") pod \"ingress-canary-49gfv\" (UID: \"d842a173-2088-46ee-bdd3-4f058a2c62e8\") " pod="openshift-ingress-canary/ingress-canary-49gfv" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.858810 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xmkj\" (UniqueName: \"kubernetes.io/projected/473085e8-ee17-4244-abd0-dcf2308b4655-kube-api-access-8xmkj\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.858836 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be9ae029-fa6d-44b8-9e03-af525859dd09-config\") pod \"service-ca-operator-777779d784-l285h\" (UID: \"be9ae029-fa6d-44b8-9e03-af525859dd09\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-l285h" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.858861 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ff4ae6b4-eebc-4a32-b390-ec7ea70c8841-console-config\") pod \"console-f9d7485db-2zlqs\" (UID: \"ff4ae6b4-eebc-4a32-b390-ec7ea70c8841\") " pod="openshift-console/console-f9d7485db-2zlqs" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.858878 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c03762c2-c2af-4472-abb5-5017f75e738f-config\") pod \"kube-controller-manager-operator-78b949d7b-lwsmc\" (UID: \"c03762c2-c2af-4472-abb5-5017f75e738f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lwsmc" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.858906 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npzp5\" (UniqueName: \"kubernetes.io/projected/6b71bdcd-f324-489c-a3ae-61ac7648b36a-kube-api-access-npzp5\") pod \"olm-operator-6b444d44fb-hvdhs\" (UID: \"6b71bdcd-f324-489c-a3ae-61ac7648b36a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hvdhs" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.858922 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq27l\" (UniqueName: \"kubernetes.io/projected/2af8695f-a945-411d-ac95-03191fb3080d-kube-api-access-vq27l\") pod 
\"collect-profiles-29566515-6hw5x\" (UID: \"2af8695f-a945-411d-ac95-03191fb3080d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566515-6hw5x" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.858940 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkhhz\" (UniqueName: \"kubernetes.io/projected/b75a7366-ff7f-4176-80f2-687c82069d70-kube-api-access-pkhhz\") pod \"migrator-59844c95c7-s629z\" (UID: \"b75a7366-ff7f-4176-80f2-687c82069d70\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s629z" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.858957 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b1b74013-b7b2-4e6b-a227-3161015b1d80-proxy-tls\") pod \"machine-config-operator-74547568cd-2rp65\" (UID: \"b1b74013-b7b2-4e6b-a227-3161015b1d80\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2rp65" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.858973 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6b71bdcd-f324-489c-a3ae-61ac7648b36a-srv-cert\") pod \"olm-operator-6b444d44fb-hvdhs\" (UID: \"6b71bdcd-f324-489c-a3ae-61ac7648b36a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hvdhs" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.858990 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj5k8\" (UniqueName: \"kubernetes.io/projected/be9ae029-fa6d-44b8-9e03-af525859dd09-kube-api-access-sj5k8\") pod \"service-ca-operator-777779d784-l285h\" (UID: \"be9ae029-fa6d-44b8-9e03-af525859dd09\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-l285h" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.859009 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6799ceeb-28d4-4caf-97e4-e9115baae071-etcd-ca\") pod \"etcd-operator-b45778765-vmnvn\" (UID: \"6799ceeb-28d4-4caf-97e4-e9115baae071\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vmnvn" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.859038 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1f707ea9-be11-4354-99a3-e439dd4e6173-webhook-cert\") pod \"packageserver-d55dfcdfc-gd9xh\" (UID: \"1f707ea9-be11-4354-99a3-e439dd4e6173\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gd9xh" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.859055 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/88f1260c-ea7b-4282-b1af-a7f738cf40b9-profile-collector-cert\") pod \"catalog-operator-68c6474976-n2fsv\" (UID: \"88f1260c-ea7b-4282-b1af-a7f738cf40b9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n2fsv" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.859073 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6799ceeb-28d4-4caf-97e4-e9115baae071-etcd-service-ca\") pod \"etcd-operator-b45778765-vmnvn\" (UID: \"6799ceeb-28d4-4caf-97e4-e9115baae071\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vmnvn" Mar 20 
07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.859091 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2af8695f-a945-411d-ac95-03191fb3080d-secret-volume\") pod \"collect-profiles-29566515-6hw5x\" (UID: \"2af8695f-a945-411d-ac95-03191fb3080d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566515-6hw5x"
Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.859108 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecc1f279-eced-4b51-8ded-b7d00d089722-config\") pod \"machine-api-operator-5694c8668f-phw2k\" (UID: \"ecc1f279-eced-4b51-8ded-b7d00d089722\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-phw2k"
Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.859144 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/16b05ee8-fdde-4f11-936e-0982042ccfcf-socket-dir\") pod \"csi-hostpathplugin-8q7f7\" (UID: \"16b05ee8-fdde-4f11-936e-0982042ccfcf\") " pod="hostpath-provisioner/csi-hostpathplugin-8q7f7"
Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.859160 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be9ae029-fa6d-44b8-9e03-af525859dd09-serving-cert\") pod \"service-ca-operator-777779d784-l285h\" (UID: \"be9ae029-fa6d-44b8-9e03-af525859dd09\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-l285h"
Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.859188 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/da62d543-787a-4364-8271-8f8f9529dd0c-default-certificate\") pod \"router-default-5444994796-dbs7t\" (UID: \"da62d543-787a-4364-8271-8f8f9529dd0c\") " pod="openshift-ingress/router-default-5444994796-dbs7t"
Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.859205 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/da62d543-787a-4364-8271-8f8f9529dd0c-service-ca-bundle\") pod \"router-default-5444994796-dbs7t\" (UID: \"da62d543-787a-4364-8271-8f8f9529dd0c\") " pod="openshift-ingress/router-default-5444994796-dbs7t"
Mar 20 07:16:09 crc kubenswrapper[4749]: E0320 07:16:09.859879 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 07:16:10.359857713 +0000 UTC m=+206.909515360 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.860097 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/da62d543-787a-4364-8271-8f8f9529dd0c-service-ca-bundle\") pod \"router-default-5444994796-dbs7t\" (UID: \"da62d543-787a-4364-8271-8f8f9529dd0c\") " pod="openshift-ingress/router-default-5444994796-dbs7t"
Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.860785 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ff4ae6b4-eebc-4a32-b390-ec7ea70c8841-service-ca\") pod \"console-f9d7485db-2zlqs\" (UID: \"ff4ae6b4-eebc-4a32-b390-ec7ea70c8841\") " pod="openshift-console/console-f9d7485db-2zlqs"
Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.862513 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/657acde5-fe52-4eaa-812c-00914daf93ba-config\") pod \"kube-apiserver-operator-766d6c64bb-c9ncz\" (UID: \"657acde5-fe52-4eaa-812c-00914daf93ba\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c9ncz"
Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.863481 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/473085e8-ee17-4244-abd0-dcf2308b4655-ca-trust-extracted\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk"
Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.865186 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff4ae6b4-eebc-4a32-b390-ec7ea70c8841-trusted-ca-bundle\") pod \"console-f9d7485db-2zlqs\" (UID: \"ff4ae6b4-eebc-4a32-b390-ec7ea70c8841\") " pod="openshift-console/console-f9d7485db-2zlqs"
Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.866102 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ff4ae6b4-eebc-4a32-b390-ec7ea70c8841-oauth-serving-cert\") pod \"console-f9d7485db-2zlqs\" (UID: \"ff4ae6b4-eebc-4a32-b390-ec7ea70c8841\") " pod="openshift-console/console-f9d7485db-2zlqs"
Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.866552 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/00730545-e9b7-4166-9f09-7a6fcac8cad3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-j79zb\" (UID: \"00730545-e9b7-4166-9f09-7a6fcac8cad3\") " pod="openshift-marketplace/marketplace-operator-79b997595-j79zb"
Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.866814 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/1f707ea9-be11-4354-99a3-e439dd4e6173-tmpfs\") pod \"packageserver-d55dfcdfc-gd9xh\" (UID: 
\"1f707ea9-be11-4354-99a3-e439dd4e6173\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gd9xh" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.868066 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/16c3a232-5504-4648-a65b-2a0d89126e22-trusted-ca\") pod \"ingress-operator-5b745b69d9-jbqrm\" (UID: \"16c3a232-5504-4648-a65b-2a0d89126e22\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jbqrm" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.870478 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e8fb037-3e85-4c5a-a782-857cb17429af-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-ldf7s\" (UID: \"2e8fb037-3e85-4c5a-a782-857cb17429af\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ldf7s" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.875834 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da62d543-787a-4364-8271-8f8f9529dd0c-metrics-certs\") pod \"router-default-5444994796-dbs7t\" (UID: \"da62d543-787a-4364-8271-8f8f9529dd0c\") " pod="openshift-ingress/router-default-5444994796-dbs7t" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.876473 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6799ceeb-28d4-4caf-97e4-e9115baae071-config\") pod \"etcd-operator-b45778765-vmnvn\" (UID: \"6799ceeb-28d4-4caf-97e4-e9115baae071\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vmnvn" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.877590 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1f707ea9-be11-4354-99a3-e439dd4e6173-apiservice-cert\") pod \"packageserver-d55dfcdfc-gd9xh\" (UID: \"1f707ea9-be11-4354-99a3-e439dd4e6173\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gd9xh" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.877838 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6df619ad-9a4e-4b8b-ba74-0d9b364bdc8d-auth-proxy-config\") pod \"machine-approver-56656f9798-64rv4\" (UID: \"6df619ad-9a4e-4b8b-ba74-0d9b364bdc8d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-64rv4" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.878475 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6799ceeb-28d4-4caf-97e4-e9115baae071-etcd-service-ca\") pod \"etcd-operator-b45778765-vmnvn\" (UID: \"6799ceeb-28d4-4caf-97e4-e9115baae071\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vmnvn" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.878926 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/00730545-e9b7-4166-9f09-7a6fcac8cad3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-j79zb\" (UID: \"00730545-e9b7-4166-9f09-7a6fcac8cad3\") " pod="openshift-marketplace/marketplace-operator-79b997595-j79zb" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.879230 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/6df619ad-9a4e-4b8b-ba74-0d9b364bdc8d-config\") pod \"machine-approver-56656f9798-64rv4\" (UID: \"6df619ad-9a4e-4b8b-ba74-0d9b364bdc8d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-64rv4" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.881959 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b1b74013-b7b2-4e6b-a227-3161015b1d80-auth-proxy-config\") pod \"machine-config-operator-74547568cd-2rp65\" (UID: \"b1b74013-b7b2-4e6b-a227-3161015b1d80\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2rp65" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.883884 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b1b74013-b7b2-4e6b-a227-3161015b1d80-images\") pod \"machine-config-operator-74547568cd-2rp65\" (UID: \"b1b74013-b7b2-4e6b-a227-3161015b1d80\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2rp65" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.884230 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b1b74013-b7b2-4e6b-a227-3161015b1d80-proxy-tls\") pod \"machine-config-operator-74547568cd-2rp65\" (UID: \"b1b74013-b7b2-4e6b-a227-3161015b1d80\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2rp65" Mar 20 07:16:09 crc kubenswrapper[4749]: W0320 07:16:09.884228 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38b3f23d_6db5_4788_bcd5_810450677cd6.slice/crio-f49ba121379c7d5198c293dcb95afaaf4bb879014adafa6923fcf7f2c9f0066a WatchSource:0}: Error finding container f49ba121379c7d5198c293dcb95afaaf4bb879014adafa6923fcf7f2c9f0066a: Status 404 returned error can't find the container with id f49ba121379c7d5198c293dcb95afaaf4bb879014adafa6923fcf7f2c9f0066a Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.884324 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/88f1260c-ea7b-4282-b1af-a7f738cf40b9-srv-cert\") pod \"catalog-operator-68c6474976-n2fsv\" (UID: \"88f1260c-ea7b-4282-b1af-a7f738cf40b9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n2fsv" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.884617 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c03762c2-c2af-4472-abb5-5017f75e738f-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-lwsmc\" (UID: \"c03762c2-c2af-4472-abb5-5017f75e738f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lwsmc" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.885355 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6799ceeb-28d4-4caf-97e4-e9115baae071-etcd-ca\") pod \"etcd-operator-b45778765-vmnvn\" (UID: \"6799ceeb-28d4-4caf-97e4-e9115baae071\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vmnvn" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.886191 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/88f1260c-ea7b-4282-b1af-a7f738cf40b9-profile-collector-cert\") pod 
\"catalog-operator-68c6474976-n2fsv\" (UID: \"88f1260c-ea7b-4282-b1af-a7f738cf40b9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n2fsv" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.887366 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dce27735-9fe1-49f2-a05a-c042a4b6db32-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-bbrk6\" (UID: \"dce27735-9fe1-49f2-a05a-c042a4b6db32\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bbrk6" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.889575 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e8fb037-3e85-4c5a-a782-857cb17429af-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-ldf7s\" (UID: \"2e8fb037-3e85-4c5a-a782-857cb17429af\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ldf7s" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.890734 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/6df619ad-9a4e-4b8b-ba74-0d9b364bdc8d-machine-approver-tls\") pod \"machine-approver-56656f9798-64rv4\" (UID: \"6df619ad-9a4e-4b8b-ba74-0d9b364bdc8d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-64rv4" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.891833 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/16c3a232-5504-4648-a65b-2a0d89126e22-metrics-tls\") pod \"ingress-operator-5b745b69d9-jbqrm\" (UID: \"16c3a232-5504-4648-a65b-2a0d89126e22\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jbqrm" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.892480 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/da62d543-787a-4364-8271-8f8f9529dd0c-default-certificate\") pod \"router-default-5444994796-dbs7t\" (UID: \"da62d543-787a-4364-8271-8f8f9529dd0c\") " pod="openshift-ingress/router-default-5444994796-dbs7t" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.893814 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/657acde5-fe52-4eaa-812c-00914daf93ba-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-c9ncz\" (UID: \"657acde5-fe52-4eaa-812c-00914daf93ba\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c9ncz" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.894183 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c03762c2-c2af-4472-abb5-5017f75e738f-config\") pod \"kube-controller-manager-operator-78b949d7b-lwsmc\" (UID: \"c03762c2-c2af-4472-abb5-5017f75e738f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lwsmc" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.895531 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgkxj\" (UniqueName: \"kubernetes.io/projected/6df619ad-9a4e-4b8b-ba74-0d9b364bdc8d-kube-api-access-hgkxj\") pod \"machine-approver-56656f9798-64rv4\" (UID: \"6df619ad-9a4e-4b8b-ba74-0d9b364bdc8d\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-64rv4" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.897729 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/da62d543-787a-4364-8271-8f8f9529dd0c-stats-auth\") pod \"router-default-5444994796-dbs7t\" (UID: \"da62d543-787a-4364-8271-8f8f9529dd0c\") " pod="openshift-ingress/router-default-5444994796-dbs7t" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.898296 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1f707ea9-be11-4354-99a3-e439dd4e6173-webhook-cert\") pod \"packageserver-d55dfcdfc-gd9xh\" (UID: \"1f707ea9-be11-4354-99a3-e439dd4e6173\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gd9xh" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.898640 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a758ef11-ae4c-4d21-96b4-0a8bded670a3-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-bsgpp\" (UID: \"a758ef11-ae4c-4d21-96b4-0a8bded670a3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bsgpp" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.901225 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/473085e8-ee17-4244-abd0-dcf2308b4655-registry-tls\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.901571 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ff4ae6b4-eebc-4a32-b390-ec7ea70c8841-console-config\") pod \"console-f9d7485db-2zlqs\" (UID: \"ff4ae6b4-eebc-4a32-b390-ec7ea70c8841\") " pod="openshift-console/console-f9d7485db-2zlqs" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.901952 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ff4ae6b4-eebc-4a32-b390-ec7ea70c8841-console-serving-cert\") pod \"console-f9d7485db-2zlqs\" (UID: \"ff4ae6b4-eebc-4a32-b390-ec7ea70c8841\") " pod="openshift-console/console-f9d7485db-2zlqs" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.903542 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a758ef11-ae4c-4d21-96b4-0a8bded670a3-config\") pod \"openshift-apiserver-operator-796bbdcf4f-bsgpp\" (UID: \"a758ef11-ae4c-4d21-96b4-0a8bded670a3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bsgpp" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.904818 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/473085e8-ee17-4244-abd0-dcf2308b4655-trusted-ca\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.905565 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/473085e8-ee17-4244-abd0-dcf2308b4655-registry-certificates\") pod 
\"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.912667 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dce27735-9fe1-49f2-a05a-c042a4b6db32-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-bbrk6\" (UID: \"dce27735-9fe1-49f2-a05a-c042a4b6db32\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bbrk6" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.912668 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6799ceeb-28d4-4caf-97e4-e9115baae071-etcd-client\") pod \"etcd-operator-b45778765-vmnvn\" (UID: \"6799ceeb-28d4-4caf-97e4-e9115baae071\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vmnvn" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.913609 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ff4ae6b4-eebc-4a32-b390-ec7ea70c8841-console-oauth-config\") pod \"console-f9d7485db-2zlqs\" (UID: \"ff4ae6b4-eebc-4a32-b390-ec7ea70c8841\") " pod="openshift-console/console-f9d7485db-2zlqs" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.919601 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wk7m\" (UniqueName: \"kubernetes.io/projected/da62d543-787a-4364-8271-8f8f9529dd0c-kube-api-access-9wk7m\") pod \"router-default-5444994796-dbs7t\" (UID: \"da62d543-787a-4364-8271-8f8f9529dd0c\") " pod="openshift-ingress/router-default-5444994796-dbs7t" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.920105 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/473085e8-ee17-4244-abd0-dcf2308b4655-installation-pull-secrets\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.922950 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-952x2"] Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.926183 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6799ceeb-28d4-4caf-97e4-e9115baae071-serving-cert\") pod \"etcd-operator-b45778765-vmnvn\" (UID: \"6799ceeb-28d4-4caf-97e4-e9115baae071\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vmnvn" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.934790 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbnzl\" (UniqueName: \"kubernetes.io/projected/dce27735-9fe1-49f2-a05a-c042a4b6db32-kube-api-access-lbnzl\") pod \"openshift-controller-manager-operator-756b6f6bc6-bbrk6\" (UID: \"dce27735-9fe1-49f2-a05a-c042a4b6db32\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bbrk6" Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.946688 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z5lbj" 
event={"ID":"815dbd4c-68ea-43e3-a355-1658ccdccd22","Type":"ContainerStarted","Data":"e12f322c2543fd293beabf87a9983f9c04c2cc673b3fda5caac1b382aeb36297"} Mar 20 07:16:09 crc kubenswrapper[4749]: I0320 07:16:09.951634 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/657acde5-fe52-4eaa-812c-00914daf93ba-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-c9ncz\" (UID: \"657acde5-fe52-4eaa-812c-00914daf93ba\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c9ncz" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.952900 4749 generic.go:334] "Generic (PLEG): container finished" podID="d999d3d0-14e4-4759-98ab-a6d11011ca86" containerID="777f658217833ccd6308eb6e1f00cb96e65b55e3cceec3bb0b802d8e170568dd" exitCode=0 Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.952948 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-9gg28" event={"ID":"d999d3d0-14e4-4759-98ab-a6d11011ca86","Type":"ContainerDied","Data":"777f658217833ccd6308eb6e1f00cb96e65b55e3cceec3bb0b802d8e170568dd"} Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.952972 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-9gg28" event={"ID":"d999d3d0-14e4-4759-98ab-a6d11011ca86","Type":"ContainerStarted","Data":"65a8dffdebc8cf32de98d8d29e2137e9c25c107deccc9bb957255fb8f13ceec7"} Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.953702 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-dbs7t" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.954777 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gbwxf" event={"ID":"d1e24d59-bf58-421f-81a7-cc04d151fdd5","Type":"ContainerStarted","Data":"328ef8e74203a257701e4ec00e15c90aedc6857c51450e3819f62e823e40b309"} Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.956301 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvtqf" event={"ID":"4f9b9110-5f21-4d71-ac4e-61e0ff6b1899","Type":"ContainerStarted","Data":"359237dd72e0529d1d892951a9f6ef7a321163f81dd3b69bb08b325231498b3e"} Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.956323 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvtqf" event={"ID":"4f9b9110-5f21-4d71-ac4e-61e0ff6b1899","Type":"ContainerStarted","Data":"e0f33c911e139840a1235c5f3c91e2668c2965240297820c5ba107f1bec7e364"} Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.956509 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvtqf" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.965470 4749 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-jvtqf container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.965524 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvtqf" 
podUID="4f9b9110-5f21-4d71-ac4e-61e0ff6b1899" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.966013 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d842a173-2088-46ee-bdd3-4f058a2c62e8-cert\") pod \"ingress-canary-49gfv\" (UID: \"d842a173-2088-46ee-bdd3-4f058a2c62e8\") " pod="openshift-ingress-canary/ingress-canary-49gfv" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.966035 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/16b05ee8-fdde-4f11-936e-0982042ccfcf-plugins-dir\") pod \"csi-hostpathplugin-8q7f7\" (UID: \"16b05ee8-fdde-4f11-936e-0982042ccfcf\") " pod="hostpath-provisioner/csi-hostpathplugin-8q7f7" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.966061 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be9ae029-fa6d-44b8-9e03-af525859dd09-config\") pod \"service-ca-operator-777779d784-l285h\" (UID: \"be9ae029-fa6d-44b8-9e03-af525859dd09\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-l285h" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.966079 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npzp5\" (UniqueName: \"kubernetes.io/projected/6b71bdcd-f324-489c-a3ae-61ac7648b36a-kube-api-access-npzp5\") pod \"olm-operator-6b444d44fb-hvdhs\" (UID: \"6b71bdcd-f324-489c-a3ae-61ac7648b36a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hvdhs" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.966094 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq27l\" (UniqueName: \"kubernetes.io/projected/2af8695f-a945-411d-ac95-03191fb3080d-kube-api-access-vq27l\") pod \"collect-profiles-29566515-6hw5x\" (UID: \"2af8695f-a945-411d-ac95-03191fb3080d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566515-6hw5x" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.966118 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6b71bdcd-f324-489c-a3ae-61ac7648b36a-srv-cert\") pod \"olm-operator-6b444d44fb-hvdhs\" (UID: \"6b71bdcd-f324-489c-a3ae-61ac7648b36a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hvdhs" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.966135 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj5k8\" (UniqueName: \"kubernetes.io/projected/be9ae029-fa6d-44b8-9e03-af525859dd09-kube-api-access-sj5k8\") pod \"service-ca-operator-777779d784-l285h\" (UID: \"be9ae029-fa6d-44b8-9e03-af525859dd09\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-l285h" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.966153 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2af8695f-a945-411d-ac95-03191fb3080d-secret-volume\") pod \"collect-profiles-29566515-6hw5x\" (UID: \"2af8695f-a945-411d-ac95-03191fb3080d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566515-6hw5x" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 
07:16:09.966171 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecc1f279-eced-4b51-8ded-b7d00d089722-config\") pod \"machine-api-operator-5694c8668f-phw2k\" (UID: \"ecc1f279-eced-4b51-8ded-b7d00d089722\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-phw2k" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.966187 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/16b05ee8-fdde-4f11-936e-0982042ccfcf-socket-dir\") pod \"csi-hostpathplugin-8q7f7\" (UID: \"16b05ee8-fdde-4f11-936e-0982042ccfcf\") " pod="hostpath-provisioner/csi-hostpathplugin-8q7f7" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.966202 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be9ae029-fa6d-44b8-9e03-af525859dd09-serving-cert\") pod \"service-ca-operator-777779d784-l285h\" (UID: \"be9ae029-fa6d-44b8-9e03-af525859dd09\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-l285h" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.966219 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/928b2eb3-aeb6-411f-b215-a33551894e85-certs\") pod \"machine-config-server-7f66l\" (UID: \"928b2eb3-aeb6-411f-b215-a33551894e85\") " pod="openshift-machine-config-operator/machine-config-server-7f66l" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.966233 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/16b05ee8-fdde-4f11-936e-0982042ccfcf-registration-dir\") pod \"csi-hostpathplugin-8q7f7\" (UID: \"16b05ee8-fdde-4f11-936e-0982042ccfcf\") " pod="hostpath-provisioner/csi-hostpathplugin-8q7f7" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.966248 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdg2k\" (UniqueName: \"kubernetes.io/projected/928b2eb3-aeb6-411f-b215-a33551894e85-kube-api-access-jdg2k\") pod \"machine-config-server-7f66l\" (UID: \"928b2eb3-aeb6-411f-b215-a33551894e85\") " pod="openshift-machine-config-operator/machine-config-server-7f66l" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.966271 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ecc1f279-eced-4b51-8ded-b7d00d089722-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-phw2k\" (UID: \"ecc1f279-eced-4b51-8ded-b7d00d089722\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-phw2k" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.966299 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ecc1f279-eced-4b51-8ded-b7d00d089722-images\") pod \"machine-api-operator-5694c8668f-phw2k\" (UID: \"ecc1f279-eced-4b51-8ded-b7d00d089722\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-phw2k" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.966315 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpgj5\" (UniqueName: \"kubernetes.io/projected/eac86025-7f7e-49a8-ac1b-4bf8c1a65c35-kube-api-access-tpgj5\") pod \"service-ca-9c57cc56f-jfhjj\" (UID: 
\"eac86025-7f7e-49a8-ac1b-4bf8c1a65c35\") " pod="openshift-service-ca/service-ca-9c57cc56f-jfhjj" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.966375 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/928b2eb3-aeb6-411f-b215-a33551894e85-node-bootstrap-token\") pod \"machine-config-server-7f66l\" (UID: \"928b2eb3-aeb6-411f-b215-a33551894e85\") " pod="openshift-machine-config-operator/machine-config-server-7f66l" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.966399 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/eac86025-7f7e-49a8-ac1b-4bf8c1a65c35-signing-cabundle\") pod \"service-ca-9c57cc56f-jfhjj\" (UID: \"eac86025-7f7e-49a8-ac1b-4bf8c1a65c35\") " pod="openshift-service-ca/service-ca-9c57cc56f-jfhjj" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.966416 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6b71bdcd-f324-489c-a3ae-61ac7648b36a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-hvdhs\" (UID: \"6b71bdcd-f324-489c-a3ae-61ac7648b36a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hvdhs" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.966431 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6znw7\" (UniqueName: \"kubernetes.io/projected/d842a173-2088-46ee-bdd3-4f058a2c62e8-kube-api-access-6znw7\") pod \"ingress-canary-49gfv\" (UID: \"d842a173-2088-46ee-bdd3-4f058a2c62e8\") " pod="openshift-ingress-canary/ingress-canary-49gfv" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.966447 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d441824b-dc11-4f89-af28-0a5c76439296-metrics-tls\") pod \"dns-default-t8g8d\" (UID: \"d441824b-dc11-4f89-af28-0a5c76439296\") " pod="openshift-dns/dns-default-t8g8d" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.966474 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2af8695f-a945-411d-ac95-03191fb3080d-config-volume\") pod \"collect-profiles-29566515-6hw5x\" (UID: \"2af8695f-a945-411d-ac95-03191fb3080d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566515-6hw5x" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.966490 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9qln\" (UniqueName: \"kubernetes.io/projected/16b05ee8-fdde-4f11-936e-0982042ccfcf-kube-api-access-f9qln\") pod \"csi-hostpathplugin-8q7f7\" (UID: \"16b05ee8-fdde-4f11-936e-0982042ccfcf\") " pod="hostpath-provisioner/csi-hostpathplugin-8q7f7" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.966524 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0121f83c-494b-40f1-9a70-65344ed716ad-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-5pwjp\" (UID: \"0121f83c-494b-40f1-9a70-65344ed716ad\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5pwjp" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.966553 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/16b05ee8-fdde-4f11-936e-0982042ccfcf-csi-data-dir\") pod \"csi-hostpathplugin-8q7f7\" (UID: \"16b05ee8-fdde-4f11-936e-0982042ccfcf\") " pod="hostpath-provisioner/csi-hostpathplugin-8q7f7" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.966573 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.966607 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stmzg\" (UniqueName: \"kubernetes.io/projected/d441824b-dc11-4f89-af28-0a5c76439296-kube-api-access-stmzg\") pod \"dns-default-t8g8d\" (UID: \"d441824b-dc11-4f89-af28-0a5c76439296\") " pod="openshift-dns/dns-default-t8g8d" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.966620 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/eac86025-7f7e-49a8-ac1b-4bf8c1a65c35-signing-key\") pod \"service-ca-9c57cc56f-jfhjj\" (UID: \"eac86025-7f7e-49a8-ac1b-4bf8c1a65c35\") " pod="openshift-service-ca/service-ca-9c57cc56f-jfhjj" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.966635 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxnkp\" (UniqueName: \"kubernetes.io/projected/ecc1f279-eced-4b51-8ded-b7d00d089722-kube-api-access-vxnkp\") pod \"machine-api-operator-5694c8668f-phw2k\" (UID: \"ecc1f279-eced-4b51-8ded-b7d00d089722\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-phw2k" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.966653 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/16b05ee8-fdde-4f11-936e-0982042ccfcf-mountpoint-dir\") pod \"csi-hostpathplugin-8q7f7\" (UID: \"16b05ee8-fdde-4f11-936e-0982042ccfcf\") " pod="hostpath-provisioner/csi-hostpathplugin-8q7f7" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.966670 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v89xs\" (UniqueName: \"kubernetes.io/projected/0121f83c-494b-40f1-9a70-65344ed716ad-kube-api-access-v89xs\") pod \"package-server-manager-789f6589d5-5pwjp\" (UID: \"0121f83c-494b-40f1-9a70-65344ed716ad\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5pwjp" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.966686 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d441824b-dc11-4f89-af28-0a5c76439296-config-volume\") pod \"dns-default-t8g8d\" (UID: \"d441824b-dc11-4f89-af28-0a5c76439296\") " pod="openshift-dns/dns-default-t8g8d" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.967214 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d441824b-dc11-4f89-af28-0a5c76439296-config-volume\") pod \"dns-default-t8g8d\" (UID: \"d441824b-dc11-4f89-af28-0a5c76439296\") " pod="openshift-dns/dns-default-t8g8d" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.967957 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecc1f279-eced-4b51-8ded-b7d00d089722-config\") pod \"machine-api-operator-5694c8668f-phw2k\" (UID: \"ecc1f279-eced-4b51-8ded-b7d00d089722\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-phw2k" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.968113 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/16b05ee8-fdde-4f11-936e-0982042ccfcf-socket-dir\") pod \"csi-hostpathplugin-8q7f7\" (UID: \"16b05ee8-fdde-4f11-936e-0982042ccfcf\") " pod="hostpath-provisioner/csi-hostpathplugin-8q7f7" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.969838 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2af8695f-a945-411d-ac95-03191fb3080d-secret-volume\") pod \"collect-profiles-29566515-6hw5x\" (UID: \"2af8695f-a945-411d-ac95-03191fb3080d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566515-6hw5x" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.969870 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6b71bdcd-f324-489c-a3ae-61ac7648b36a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-hvdhs\" (UID: \"6b71bdcd-f324-489c-a3ae-61ac7648b36a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hvdhs" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.969957 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" event={"ID":"38b3f23d-6db5-4788-bcd5-810450677cd6","Type":"ContainerStarted","Data":"f49ba121379c7d5198c293dcb95afaaf4bb879014adafa6923fcf7f2c9f0066a"} Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.970112 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/16b05ee8-fdde-4f11-936e-0982042ccfcf-registration-dir\") pod \"csi-hostpathplugin-8q7f7\" (UID: \"16b05ee8-fdde-4f11-936e-0982042ccfcf\") " pod="hostpath-provisioner/csi-hostpathplugin-8q7f7" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.970693 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ecc1f279-eced-4b51-8ded-b7d00d089722-images\") pod \"machine-api-operator-5694c8668f-phw2k\" (UID: \"ecc1f279-eced-4b51-8ded-b7d00d089722\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-phw2k" Mar 20 07:16:10 crc kubenswrapper[4749]: E0320 07:16:09.970947 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 07:16:10.470935377 +0000 UTC m=+207.020593024 (durationBeforeRetry 500ms). 
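
The MountDevice error above repeats until the hostpath CSI plugin registers itself with the kubelet: the image-registry pod's PVC names driver kubevirt.io.hostpath-provisioner, but the csi-hostpathplugin-8q7f7 pod that provides it is itself still being started in this same window (its registration-dir and socket-dir mounts appear a few records earlier). A hedged sketch, assuming the official kubernetes Python client and a working kubeconfig, of how one could check which drivers the node has actually registered:

    # The CSINode object mirrors the kubelet's list of registered CSI drivers,
    # i.e. the list the error says kubevirt.io.hostpath-provisioner is missing
    # from. The node name "crc" is taken from the log prefix.
    from kubernetes import client, config

    config.load_kube_config()
    csinode = client.StorageV1Api().read_csi_node("crc")
    drivers = [d.name for d in (csinode.spec.drivers or [])]
    print("registered CSI drivers:", drivers)
    print("hostpath provisioner present:",
          "kubevirt.io.hostpath-provisioner" in drivers)
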
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.971034 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/928b2eb3-aeb6-411f-b215-a33551894e85-certs\") pod \"machine-config-server-7f66l\" (UID: \"928b2eb3-aeb6-411f-b215-a33551894e85\") " pod="openshift-machine-config-operator/machine-config-server-7f66l"
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.971525 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2af8695f-a945-411d-ac95-03191fb3080d-config-volume\") pod \"collect-profiles-29566515-6hw5x\" (UID: \"2af8695f-a945-411d-ac95-03191fb3080d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566515-6hw5x"
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.971588 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/16b05ee8-fdde-4f11-936e-0982042ccfcf-csi-data-dir\") pod \"csi-hostpathplugin-8q7f7\" (UID: \"16b05ee8-fdde-4f11-936e-0982042ccfcf\") " pod="hostpath-provisioner/csi-hostpathplugin-8q7f7"
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.973588 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/16b05ee8-fdde-4f11-936e-0982042ccfcf-mountpoint-dir\") pod \"csi-hostpathplugin-8q7f7\" (UID: \"16b05ee8-fdde-4f11-936e-0982042ccfcf\") " pod="hostpath-provisioner/csi-hostpathplugin-8q7f7"
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.973625 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/16b05ee8-fdde-4f11-936e-0982042ccfcf-plugins-dir\") pod \"csi-hostpathplugin-8q7f7\" (UID: \"16b05ee8-fdde-4f11-936e-0982042ccfcf\") " pod="hostpath-provisioner/csi-hostpathplugin-8q7f7"
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.974133 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be9ae029-fa6d-44b8-9e03-af525859dd09-config\") pod \"service-ca-operator-777779d784-l285h\" (UID: \"be9ae029-fa6d-44b8-9e03-af525859dd09\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-l285h"
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.975162 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/eac86025-7f7e-49a8-ac1b-4bf8c1a65c35-signing-cabundle\") pod \"service-ca-9c57cc56f-jfhjj\" (UID: \"eac86025-7f7e-49a8-ac1b-4bf8c1a65c35\") " pod="openshift-service-ca/service-ca-9c57cc56f-jfhjj"
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.975735 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ecc1f279-eced-4b51-8ded-b7d00d089722-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-phw2k\" (UID: \"ecc1f279-eced-4b51-8ded-b7d00d089722\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-phw2k"
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:09.980191 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c9ncz"
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.000348 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/928b2eb3-aeb6-411f-b215-a33551894e85-node-bootstrap-token\") pod \"machine-config-server-7f66l\" (UID: \"928b2eb3-aeb6-411f-b215-a33551894e85\") " pod="openshift-machine-config-operator/machine-config-server-7f66l"
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.000558 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/eac86025-7f7e-49a8-ac1b-4bf8c1a65c35-signing-key\") pod \"service-ca-9c57cc56f-jfhjj\" (UID: \"eac86025-7f7e-49a8-ac1b-4bf8c1a65c35\") " pod="openshift-service-ca/service-ca-9c57cc56f-jfhjj"
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.007582 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0121f83c-494b-40f1-9a70-65344ed716ad-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-5pwjp\" (UID: \"0121f83c-494b-40f1-9a70-65344ed716ad\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5pwjp"
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.011890 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d441824b-dc11-4f89-af28-0a5c76439296-metrics-tls\") pod \"dns-default-t8g8d\" (UID: \"d441824b-dc11-4f89-af28-0a5c76439296\") " pod="openshift-dns/dns-default-t8g8d"
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.013628 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6b71bdcd-f324-489c-a3ae-61ac7648b36a-srv-cert\") pod \"olm-operator-6b444d44fb-hvdhs\" (UID: \"6b71bdcd-f324-489c-a3ae-61ac7648b36a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hvdhs"
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.024259 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnwj5\" (UniqueName: \"kubernetes.io/projected/b1b74013-b7b2-4e6b-a227-3161015b1d80-kube-api-access-mnwj5\") pod \"machine-config-operator-74547568cd-2rp65\" (UID: \"b1b74013-b7b2-4e6b-a227-3161015b1d80\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2rp65"
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.024314 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44vbg\" (UniqueName: \"kubernetes.io/projected/a758ef11-ae4c-4d21-96b4-0a8bded670a3-kube-api-access-44vbg\") pod \"openshift-apiserver-operator-796bbdcf4f-bsgpp\" (UID: \"a758ef11-ae4c-4d21-96b4-0a8bded670a3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bsgpp"
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.024360 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fs7lf\" (UniqueName: \"kubernetes.io/projected/2e8fb037-3e85-4c5a-a782-857cb17429af-kube-api-access-fs7lf\") pod \"kube-storage-version-migrator-operator-b67b599dd-ldf7s\" (UID: \"2e8fb037-3e85-4c5a-a782-857cb17429af\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ldf7s"
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.024648 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d842a173-2088-46ee-bdd3-4f058a2c62e8-cert\") pod \"ingress-canary-49gfv\" (UID: \"d842a173-2088-46ee-bdd3-4f058a2c62e8\") " pod="openshift-ingress-canary/ingress-canary-49gfv"
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.025636 4749 generic.go:334] "Generic (PLEG): container finished" podID="309f1b8f-63a4-4019-b8f0-500dc7b60c8d" containerID="f996450503608b821ceda1072de5de72ef3f33c88bfda551509077d95cbaa822" exitCode=0
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.025677 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-q5gk7" event={"ID":"309f1b8f-63a4-4019-b8f0-500dc7b60c8d","Type":"ContainerDied","Data":"f996450503608b821ceda1072de5de72ef3f33c88bfda551509077d95cbaa822"}
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.025714 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-q5gk7" event={"ID":"309f1b8f-63a4-4019-b8f0-500dc7b60c8d","Type":"ContainerStarted","Data":"343420ed49abb1967b6424605a03e754e0a11eb5338b65450f28b102e619dc60"}
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.032973 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkkg8\" (UniqueName: \"kubernetes.io/projected/6799ceeb-28d4-4caf-97e4-e9115baae071-kube-api-access-fkkg8\") pod \"etcd-operator-b45778765-vmnvn\" (UID: \"6799ceeb-28d4-4caf-97e4-e9115baae071\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vmnvn"
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.040248 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be9ae029-fa6d-44b8-9e03-af525859dd09-serving-cert\") pod \"service-ca-operator-777779d784-l285h\" (UID: \"be9ae029-fa6d-44b8-9e03-af525859dd09\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-l285h"
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.041948 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-bjx8c"]
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.048598 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2rp65"
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.050163 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnxph\" (UniqueName: \"kubernetes.io/projected/00730545-e9b7-4166-9f09-7a6fcac8cad3-kube-api-access-fnxph\") pod \"marketplace-operator-79b997595-j79zb\" (UID: \"00730545-e9b7-4166-9f09-7a6fcac8cad3\") " pod="openshift-marketplace/marketplace-operator-79b997595-j79zb"
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.059605 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ldf7s"
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.067079 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 07:16:10 crc kubenswrapper[4749]: E0320 07:16:10.067328 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 07:16:10.567308725 +0000 UTC m=+207.116966372 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.067469 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk"
Mar 20 07:16:10 crc kubenswrapper[4749]: E0320 07:16:10.069722 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 07:16:10.569705747 +0000 UTC m=+207.119363394 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zz6kk" (UID: "473085e8-ee17-4244-abd0-dcf2308b4655") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.070596 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cx7wr\" (UniqueName: \"kubernetes.io/projected/ff4ae6b4-eebc-4a32-b390-ec7ea70c8841-kube-api-access-cx7wr\") pod \"console-f9d7485db-2zlqs\" (UID: \"ff4ae6b4-eebc-4a32-b390-ec7ea70c8841\") " pod="openshift-console/console-f9d7485db-2zlqs"
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.107946 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc5p9\" (UniqueName: \"kubernetes.io/projected/1f707ea9-be11-4354-99a3-e439dd4e6173-kube-api-access-hc5p9\") pod \"packageserver-d55dfcdfc-gd9xh\" (UID: \"1f707ea9-be11-4354-99a3-e439dd4e6173\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gd9xh"
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.144722 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkhhz\" (UniqueName: \"kubernetes.io/projected/b75a7366-ff7f-4176-80f2-687c82069d70-kube-api-access-pkhhz\") pod \"migrator-59844c95c7-s629z\" (UID: \"b75a7366-ff7f-4176-80f2-687c82069d70\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s629z"
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.145800 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-zznkl"]
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.146823 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c03762c2-c2af-4472-abb5-5017f75e738f-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-lwsmc\" (UID: \"c03762c2-c2af-4472-abb5-5017f75e738f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lwsmc"
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.148719 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hvf29"]
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.168972 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.169895 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bsgpp"
Mar 20 07:16:10 crc kubenswrapper[4749]: E0320 07:16:10.169982 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 07:16:10.669950203 +0000 UTC m=+207.219607850 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
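
Each failed mount or unmount is parked by nestedpendingoperations with a "No retries permitted until" deadline; the visible delay here is 500ms, and the kubelet backs off exponentially when the same operation keeps failing. A small illustration of that pacing follows; the 500ms initial delay comes from the log, while the doubling factor and the roughly two-minute cap are assumptions for the sketch, not values read from this kubelet:

    # Illustrative backoff schedule, not kubelet source: start at the 500ms
    # seen in the log, double on every consecutive failure, and clamp at an
    # assumed ~2 minute ceiling.
    def backoff_schedule(initial_ms=500, factor=2.0, cap_ms=120_000, attempts=9):
        delay = float(initial_ms)
        for attempt in range(1, attempts + 1):
            yield attempt, min(delay, cap_ms)
            delay *= factor

    for attempt, delay_ms in backoff_schedule():
        print(f"attempt {attempt}: retry after {delay_ms / 1000:.1f}s")
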
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.175040 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xmkj\" (UniqueName: \"kubernetes.io/projected/473085e8-ee17-4244-abd0-dcf2308b4655-kube-api-access-8xmkj\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk"
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.175710 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-2zlqs"
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.177917 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6sv2z"]
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.184441 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-64rv4"
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.202216 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgsdt\" (UniqueName: \"kubernetes.io/projected/88f1260c-ea7b-4282-b1af-a7f738cf40b9-kube-api-access-sgsdt\") pod \"catalog-operator-68c6474976-n2fsv\" (UID: \"88f1260c-ea7b-4282-b1af-a7f738cf40b9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n2fsv"
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.204572 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bbrk6"
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.207818 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-nv7hv"]
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.207867 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-lzpsv"]
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.215683 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-vmnvn"
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.218987 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnf9s\" (UniqueName: \"kubernetes.io/projected/16c3a232-5504-4648-a65b-2a0d89126e22-kube-api-access-lnf9s\") pod \"ingress-operator-5b745b69d9-jbqrm\" (UID: \"16c3a232-5504-4648-a65b-2a0d89126e22\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jbqrm"
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.225753 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lwsmc"
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.234324 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/473085e8-ee17-4244-abd0-dcf2308b4655-bound-sa-token\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk"
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.237949 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c9ncz"]
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.240566 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-j79zb"
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.265273 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gd9xh"
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.280646 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-hlw9w"]
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.280826 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk"
Mar 20 07:16:10 crc kubenswrapper[4749]: E0320 07:16:10.281557 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 07:16:10.781545251 +0000 UTC m=+207.331202888 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zz6kk" (UID: "473085e8-ee17-4244-abd0-dcf2308b4655") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.288780 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7xkd\" (UniqueName: \"kubernetes.io/projected/da95cd86-f90a-4d7f-a308-4124b22d8427-kube-api-access-r7xkd\") pod \"auto-csr-approver-29566516-6stbk\" (UID: \"da95cd86-f90a-4d7f-a308-4124b22d8427\") " pod="openshift-infra/auto-csr-approver-29566516-6stbk"
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.291084 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-blksh"]
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.302963 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdg2k\" (UniqueName: \"kubernetes.io/projected/928b2eb3-aeb6-411f-b215-a33551894e85-kube-api-access-jdg2k\") pod \"machine-config-server-7f66l\" (UID: \"928b2eb3-aeb6-411f-b215-a33551894e85\") " pod="openshift-machine-config-operator/machine-config-server-7f66l"
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.304469 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/16c3a232-5504-4648-a65b-2a0d89126e22-bound-sa-token\") pod \"ingress-operator-5b745b69d9-jbqrm\" (UID: \"16c3a232-5504-4648-a65b-2a0d89126e22\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jbqrm"
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.307686 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566516-6stbk"
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.313093 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6znw7\" (UniqueName: \"kubernetes.io/projected/d842a173-2088-46ee-bdd3-4f058a2c62e8-kube-api-access-6znw7\") pod \"ingress-canary-49gfv\" (UID: \"d842a173-2088-46ee-bdd3-4f058a2c62e8\") " pod="openshift-ingress-canary/ingress-canary-49gfv"
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.313242 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-bqpst"]
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.316258 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s629z"
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.329896 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-2rp65"]
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.330887 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n2fsv"
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.333958 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npzp5\" (UniqueName: \"kubernetes.io/projected/6b71bdcd-f324-489c-a3ae-61ac7648b36a-kube-api-access-npzp5\") pod \"olm-operator-6b444d44fb-hvdhs\" (UID: \"6b71bdcd-f324-489c-a3ae-61ac7648b36a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hvdhs"
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.354880 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj5k8\" (UniqueName: \"kubernetes.io/projected/be9ae029-fa6d-44b8-9e03-af525859dd09-kube-api-access-sj5k8\") pod \"service-ca-operator-777779d784-l285h\" (UID: \"be9ae029-fa6d-44b8-9e03-af525859dd09\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-l285h"
Mar 20 07:16:10 crc kubenswrapper[4749]: W0320 07:16:10.356430 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16e1bdb5_47b7_40e0_bc4b_cdd87976f461.slice/crio-00d476947825b5f8724c5f961407292746b9d96f33ff318220e019e98809730c WatchSource:0}: Error finding container 00d476947825b5f8724c5f961407292746b9d96f33ff318220e019e98809730c: Status 404 returned error can't find the container with id 00d476947825b5f8724c5f961407292746b9d96f33ff318220e019e98809730c
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.368085 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq27l\" (UniqueName: \"kubernetes.io/projected/2af8695f-a945-411d-ac95-03191fb3080d-kube-api-access-vq27l\") pod \"collect-profiles-29566515-6hw5x\" (UID: \"2af8695f-a945-411d-ac95-03191fb3080d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566515-6hw5x"
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.377142 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-l285h"
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.381663 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 07:16:10 crc kubenswrapper[4749]: E0320 07:16:10.382014 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 07:16:10.881986893 +0000 UTC m=+207.431644530 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.382680 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk"
Mar 20 07:16:10 crc kubenswrapper[4749]: E0320 07:16:10.383265 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 07:16:10.883250196 +0000 UTC m=+207.432907843 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zz6kk" (UID: "473085e8-ee17-4244-abd0-dcf2308b4655") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.384798 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hvdhs"
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.393186 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpgj5\" (UniqueName: \"kubernetes.io/projected/eac86025-7f7e-49a8-ac1b-4bf8c1a65c35-kube-api-access-tpgj5\") pod \"service-ca-9c57cc56f-jfhjj\" (UID: \"eac86025-7f7e-49a8-ac1b-4bf8c1a65c35\") " pod="openshift-service-ca/service-ca-9c57cc56f-jfhjj"
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.393575 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566515-6hw5x"
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.396786 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ldf7s"]
Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.402126 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-jfhjj"
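
The same volume (pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8) is failing both an UnmountVolume.TearDown for the departed pod 8f668bae-612b-4b75-9490-919e737c6a3b and a MountVolume.MountDevice for the new image-registry pod on every reconciler pass. One way to see the full retry cadence at a glance is to pull the nestedpendingoperations failures out of the log; a small, hypothetical helper (the kubelet.log path is a placeholder):

    # Extract failed volume operations and their "no retries until" deadlines
    # from a kubelet log like this one.
    import re

    PATTERN = re.compile(
        r'nestedpendingoperations\.go:\d+\] Operation for "(?P<op>[^"]+)" failed\. '
        r"No retries permitted until (?P<until>\S+ \S+)"
    )

    def failed_operations(text):
        for m in PATTERN.finditer(text):
            yield m.group("until"), m.group("op")

    with open("kubelet.log") as fh:  # placeholder path to the extracted log
        for until, op in sorted(failed_operations(fh.read())):
            print(until, op)
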
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-jfhjj" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.420338 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9qln\" (UniqueName: \"kubernetes.io/projected/16b05ee8-fdde-4f11-936e-0982042ccfcf-kube-api-access-f9qln\") pod \"csi-hostpathplugin-8q7f7\" (UID: \"16b05ee8-fdde-4f11-936e-0982042ccfcf\") " pod="hostpath-provisioner/csi-hostpathplugin-8q7f7" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.431210 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxnkp\" (UniqueName: \"kubernetes.io/projected/ecc1f279-eced-4b51-8ded-b7d00d089722-kube-api-access-vxnkp\") pod \"machine-api-operator-5694c8668f-phw2k\" (UID: \"ecc1f279-eced-4b51-8ded-b7d00d089722\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-phw2k" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.450931 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-49gfv" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.451645 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stmzg\" (UniqueName: \"kubernetes.io/projected/d441824b-dc11-4f89-af28-0a5c76439296-kube-api-access-stmzg\") pod \"dns-default-t8g8d\" (UID: \"d441824b-dc11-4f89-af28-0a5c76439296\") " pod="openshift-dns/dns-default-t8g8d" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.467298 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v89xs\" (UniqueName: \"kubernetes.io/projected/0121f83c-494b-40f1-9a70-65344ed716ad-kube-api-access-v89xs\") pod \"package-server-manager-789f6589d5-5pwjp\" (UID: \"0121f83c-494b-40f1-9a70-65344ed716ad\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5pwjp" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.485564 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 07:16:10 crc kubenswrapper[4749]: E0320 07:16:10.485931 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 07:16:10.985916905 +0000 UTC m=+207.535574552 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.486483 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-t8g8d" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.490537 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jbqrm" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.494137 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-8q7f7" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.515588 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-7f66l" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.591436 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:10 crc kubenswrapper[4749]: E0320 07:16:10.591936 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 07:16:11.09192656 +0000 UTC m=+207.641584207 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zz6kk" (UID: "473085e8-ee17-4244-abd0-dcf2308b4655") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.650078 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-2zlqs"] Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.692850 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 07:16:10 crc kubenswrapper[4749]: E0320 07:16:10.693140 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 07:16:11.19311228 +0000 UTC m=+207.742770007 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.693209 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:10 crc kubenswrapper[4749]: E0320 07:16:10.693610 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 07:16:11.193595572 +0000 UTC m=+207.743253219 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zz6kk" (UID: "473085e8-ee17-4244-abd0-dcf2308b4655") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.708540 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5pwjp" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.715995 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-phw2k" Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.737900 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bsgpp"] Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.796672 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 07:16:10 crc kubenswrapper[4749]: E0320 07:16:10.796918 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 07:16:11.296894388 +0000 UTC m=+207.846552035 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.797161 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:10 crc kubenswrapper[4749]: E0320 07:16:10.797449 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 07:16:11.297437161 +0000 UTC m=+207.847094808 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zz6kk" (UID: "473085e8-ee17-4244-abd0-dcf2308b4655") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.870564 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-s629z"] Mar 20 07:16:10 crc kubenswrapper[4749]: W0320 07:16:10.892787 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff4ae6b4_eebc_4a32_b390_ec7ea70c8841.slice/crio-811b2e6891f9edbb93ed07eb48423a089213a55c90c90572e9b85dcb977db4ba WatchSource:0}: Error finding container 811b2e6891f9edbb93ed07eb48423a089213a55c90c90572e9b85dcb977db4ba: Status 404 returned error can't find the container with id 811b2e6891f9edbb93ed07eb48423a089213a55c90c90572e9b85dcb977db4ba Mar 20 07:16:10 crc kubenswrapper[4749]: I0320 07:16:10.898100 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 07:16:10 crc kubenswrapper[4749]: E0320 07:16:10.898553 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 07:16:11.398538731 +0000 UTC m=+207.948196378 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.006358 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:11 crc kubenswrapper[4749]: E0320 07:16:11.007327 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 07:16:11.507299556 +0000 UTC m=+208.056957203 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zz6kk" (UID: "473085e8-ee17-4244-abd0-dcf2308b4655") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.108638 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 07:16:11 crc kubenswrapper[4749]: E0320 07:16:11.109225 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 07:16:11.609205825 +0000 UTC m=+208.158863472 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.210151 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.216297 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-8q7f7"] Mar 20 07:16:11 crc kubenswrapper[4749]: E0320 07:16:11.220504 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 07:16:11.720483126 +0000 UTC m=+208.270140773 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zz6kk" (UID: "473085e8-ee17-4244-abd0-dcf2308b4655") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.233375 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566516-6stbk"] Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.244650 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-vmnvn"] Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.252754 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2rp65" event={"ID":"b1b74013-b7b2-4e6b-a227-3161015b1d80","Type":"ContainerStarted","Data":"6dc2e5d89c8e912fdeb4a5e0d442488edb309712e35141b855f8c19582b37842"} Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.258468 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n2fsv"] Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.271617 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-hvf29" event={"ID":"bd1003fd-4300-423c-b500-e782a8aeb7bb","Type":"ContainerStarted","Data":"55cdec982b6e3f07b62ae4151f558f756ec1a953fdaf158d72963526ee19cf5d"} Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.278586 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bbrk6"] Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.282627 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-j79zb"] Mar 20 07:16:11 crc 
kubenswrapper[4749]: I0320 07:16:11.311002 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 07:16:11 crc kubenswrapper[4749]: E0320 07:16:11.311322 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 07:16:11.81130717 +0000 UTC m=+208.360964807 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.331836 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-9gg28" event={"ID":"d999d3d0-14e4-4759-98ab-a6d11011ca86","Type":"ContainerStarted","Data":"ba304bfafbd7d0586f6e55c039df7069accae09edec139d5300b6962e8a1855f"} Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.339910 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bsgpp" event={"ID":"a758ef11-ae4c-4d21-96b4-0a8bded670a3","Type":"ContainerStarted","Data":"df2a14dec9d4ef2ddaa2d77f3437a6e8245ad43a7595dcb901c30aec2afe95dd"} Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.343152 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lwsmc"] Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.344641 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-64rv4" event={"ID":"6df619ad-9a4e-4b8b-ba74-0d9b364bdc8d","Type":"ContainerStarted","Data":"f80be68e65467af95c4bf4bb671e335d9d07ad619128b5d5f71e9e2c4dba9610"} Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.350897 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z5lbj" event={"ID":"815dbd4c-68ea-43e3-a355-1658ccdccd22","Type":"ContainerStarted","Data":"32194683dfb846a3f980df5bd79c084c7e462328bb89015061420c8c93dc9986"} Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.351631 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gd9xh"] Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.359195 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-7f66l" event={"ID":"928b2eb3-aeb6-411f-b215-a33551894e85","Type":"ContainerStarted","Data":"fb32d47573de0e47ea95b5f57a3c04dc3e55b71b0e6cf167dcc030f28f0756fc"} Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.366973 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-lzpsv" 
event={"ID":"e2a05065-734d-4884-b037-c54ab87609eb","Type":"ContainerStarted","Data":"f0e81713b78b4d8605c85f44f158516c07f10999641aa1df36d4deb10f155416"} Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.368793 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nv7hv" event={"ID":"b32a129b-0c90-4d06-87d5-fd7e70b726e5","Type":"ContainerStarted","Data":"3e12ed54bf553cdf39a29719a7bf3dca28643193fcbafd77501689fe5be91e0f"} Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.368826 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nv7hv" event={"ID":"b32a129b-0c90-4d06-87d5-fd7e70b726e5","Type":"ContainerStarted","Data":"c45df2aba2fef1af0936d2ecbab38f72ac56b796faa9f1aefebd79e37acd04e3"} Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.374902 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zznkl" event={"ID":"e222e7a0-549c-46a7-8ee6-484dd2160be4","Type":"ContainerStarted","Data":"f801538862f77cbbe30d886ffbaa34f74745267a42a4f8ddd52e29dd0fc91138"} Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.379676 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-blksh" event={"ID":"954f5aa5-a05b-44bf-8642-e58746d21984","Type":"ContainerStarted","Data":"7422768bba786adb15807d358d61852876d9d781848b05e36276527ab3968a74"} Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.381680 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-bjx8c" event={"ID":"a67741d8-5ced-477f-ac4c-0f7fc736f363","Type":"ContainerStarted","Data":"89266a7a500c914c7878e0d8c8f64ad8e4252462038c3aacedd88f9c676dcc9b"} Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.381726 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-bjx8c" event={"ID":"a67741d8-5ced-477f-ac4c-0f7fc736f363","Type":"ContainerStarted","Data":"f0910180d3d54fbc63a587d1ac00ebb6cb102eeb8c1ae24bafdba10746ba5db9"} Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.387178 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-q5gk7" event={"ID":"309f1b8f-63a4-4019-b8f0-500dc7b60c8d","Type":"ContainerStarted","Data":"2e25618242707368c6842f50e4382e017bb5274a93f0b22f673288d961dd641c"} Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.387403 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-q5gk7" Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.389916 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c9ncz" event={"ID":"657acde5-fe52-4eaa-812c-00914daf93ba","Type":"ContainerStarted","Data":"b5b75bff5195b24c701936081f84b62a8635c77267abe9a5467d7b4631e2a28e"} Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.395419 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-dbs7t" event={"ID":"da62d543-787a-4364-8271-8f8f9529dd0c","Type":"ContainerStarted","Data":"5791eefd8dd020f67c3cb471f94e9c5e599b0cd3796190b555b0915304ec117d"} Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.395454 4749 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-dbs7t" event={"ID":"da62d543-787a-4364-8271-8f8f9529dd0c","Type":"ContainerStarted","Data":"87c31fd6db7881fdb79603bc06b94255badc64b4a971ed430146a5e2449b8e67"} Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.403947 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.412002 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:11 crc kubenswrapper[4749]: E0320 07:16:11.412354 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 07:16:11.912343178 +0000 UTC m=+208.462000825 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zz6kk" (UID: "473085e8-ee17-4244-abd0-dcf2308b4655") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.418683 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gbwxf" event={"ID":"d1e24d59-bf58-421f-81a7-cc04d151fdd5","Type":"ContainerStarted","Data":"4e7cf4b13260f271311e5f314933a4376b0d8e8ae493c1412aeefc4dfd069bad"} Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.430899 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ldf7s" event={"ID":"2e8fb037-3e85-4c5a-a782-857cb17429af","Type":"ContainerStarted","Data":"bb0f5d2a8d0d91c078b6ffb870ec5c9c7b070c8bc6db23a90240e7d56b8e72c8"} Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.432920 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-952x2" event={"ID":"95e4555b-7f8b-4297-bed6-e0cf5e90ea3e","Type":"ContainerStarted","Data":"c0c1d448ba2126cdf6d9b5b6ef93eaa99e3012e82867707a7dac694de92bbf45"} Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.432950 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-952x2" event={"ID":"95e4555b-7f8b-4297-bed6-e0cf5e90ea3e","Type":"ContainerStarted","Data":"68a8a58d7cca16ca1db1e260a7e8d59b78dc8798f4f299dc266b3b8deb9242a3"} Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.433610 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-952x2" Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.437201 4749 patch_prober.go:28] interesting pod/downloads-7954f5f757-952x2 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection 
refused" start-of-body= Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.437242 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-952x2" podUID="95e4555b-7f8b-4297-bed6-e0cf5e90ea3e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.452885 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-bqpst" event={"ID":"214938de-72be-4416-84ab-c12591ef2c68","Type":"ContainerStarted","Data":"d738e7ec47ccf5a52baefc7a021d224b573ad4f9cecc59cb9a76df8b27f6a51f"} Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.453228 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-bqpst" Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.454626 4749 patch_prober.go:28] interesting pod/console-operator-58897d9998-bqpst container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.18:8443/readyz\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.454676 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-bqpst" podUID="214938de-72be-4416-84ab-c12591ef2c68" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.18:8443/readyz\": dial tcp 10.217.0.18:8443: connect: connection refused" Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.472704 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" event={"ID":"38b3f23d-6db5-4788-bcd5-810450677cd6","Type":"ContainerStarted","Data":"9b4290553fefaf6bd1dc9d71cbd1a4957939dd415b4419fa033eb797173a8cb2"} Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.472921 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.475213 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6sv2z" event={"ID":"da30337a-26c0-4b0b-beb5-c46c48facfc6","Type":"ContainerStarted","Data":"791d6f58f302d9b0949d8a7590b836187b010622f7f3331648628368d838ac8e"} Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.475244 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6sv2z" event={"ID":"da30337a-26c0-4b0b-beb5-c46c48facfc6","Type":"ContainerStarted","Data":"dbc7965dad1fbaa14166c3f48ab6342821771c8a8bf2c79ecc17c5cf20513b3e"} Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.478680 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s629z" event={"ID":"b75a7366-ff7f-4176-80f2-687c82069d70","Type":"ContainerStarted","Data":"ed488a4a7b507b8fe1cd417fb746c850c01880d833705b2d4b47784f8ea9c9f5"} Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.483119 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-hlw9w" 
event={"ID":"16e1bdb5-47b7-40e0-bc4b-cdd87976f461","Type":"ContainerStarted","Data":"00d476947825b5f8724c5f961407292746b9d96f33ff318220e019e98809730c"} Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.486949 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-2zlqs" event={"ID":"ff4ae6b4-eebc-4a32-b390-ec7ea70c8841","Type":"ContainerStarted","Data":"811b2e6891f9edbb93ed07eb48423a089213a55c90c90572e9b85dcb977db4ba"} Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.493221 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvtqf" Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.513419 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 07:16:11 crc kubenswrapper[4749]: E0320 07:16:11.513575 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 07:16:12.013516098 +0000 UTC m=+208.563173745 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.516726 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:11 crc kubenswrapper[4749]: E0320 07:16:11.517787 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 07:16:12.017772877 +0000 UTC m=+208.567430524 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zz6kk" (UID: "473085e8-ee17-4244-abd0-dcf2308b4655") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.530524 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvtqf" podStartSLOduration=150.530504244 podStartE2EDuration="2m30.530504244s" podCreationTimestamp="2026-03-20 07:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:16:11.528829472 +0000 UTC m=+208.078487129" watchObservedRunningTime="2026-03-20 07:16:11.530504244 +0000 UTC m=+208.080161891" Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.550726 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-jfhjj"] Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.617503 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 07:16:11 crc kubenswrapper[4749]: E0320 07:16:11.618626 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 07:16:12.118600709 +0000 UTC m=+208.668258416 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:11 crc kubenswrapper[4749]: W0320 07:16:11.650677 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeac86025_7f7e_49a8_ac1b_4bf8c1a65c35.slice/crio-fd2dad77326d20c07be6d027daf62aa7877d7eaadda359dbe47eb81be39ddab8 WatchSource:0}: Error finding container fd2dad77326d20c07be6d027daf62aa7877d7eaadda359dbe47eb81be39ddab8: Status 404 returned error can't find the container with id fd2dad77326d20c07be6d027daf62aa7877d7eaadda359dbe47eb81be39ddab8 Mar 20 07:16:11 crc kubenswrapper[4749]: W0320 07:16:11.711141 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f707ea9_be11_4354_99a3_e439dd4e6173.slice/crio-76a481a91e796c2d26a63ae6b54566e1b4251ec258335089ca43bfb8b6e14a17 WatchSource:0}: Error finding container 76a481a91e796c2d26a63ae6b54566e1b4251ec258335089ca43bfb8b6e14a17: Status 404 returned error can't find the container with id 76a481a91e796c2d26a63ae6b54566e1b4251ec258335089ca43bfb8b6e14a17 Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.720505 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:11 crc kubenswrapper[4749]: E0320 07:16:11.741385 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 07:16:12.241367575 +0000 UTC m=+208.791025222 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zz6kk" (UID: "473085e8-ee17-4244-abd0-dcf2308b4655") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.839766 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 07:16:11 crc kubenswrapper[4749]: E0320 07:16:11.840111 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 07:16:12.340096742 +0000 UTC m=+208.889754389 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.845193 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-jbqrm"] Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.879240 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-t8g8d"] Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.899680 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5pwjp"] Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.928830 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-l285h"] Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.940831 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.940909 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566515-6hw5x"] Mar 20 07:16:11 crc kubenswrapper[4749]: E0320 07:16:11.941170 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 07:16:12.441154839 +0000 UTC m=+208.990812486 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zz6kk" (UID: "473085e8-ee17-4244-abd0-dcf2308b4655") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.949686 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hvdhs"] Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.952096 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-49gfv"] Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.956478 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-dbs7t" Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.964621 4749 patch_prober.go:28] interesting pod/router-default-5444994796-dbs7t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 07:16:11 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Mar 20 07:16:11 crc kubenswrapper[4749]: [+]process-running ok Mar 20 07:16:11 crc kubenswrapper[4749]: healthz check failed Mar 20 07:16:11 crc kubenswrapper[4749]: I0320 07:16:11.964657 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dbs7t" podUID="da62d543-787a-4364-8271-8f8f9529dd0c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.016616 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.041756 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 07:16:12 crc kubenswrapper[4749]: E0320 07:16:12.042197 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 07:16:12.542178846 +0000 UTC m=+209.091836493 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.053727 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-952x2" podStartSLOduration=151.053704023 podStartE2EDuration="2m31.053704023s" podCreationTimestamp="2026-03-20 07:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:16:12.04310152 +0000 UTC m=+208.592759167" watchObservedRunningTime="2026-03-20 07:16:12.053704023 +0000 UTC m=+208.603361690" Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.060666 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-phw2k"] Mar 20 07:16:12 crc kubenswrapper[4749]: W0320 07:16:12.109070 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0121f83c_494b_40f1_9a70_65344ed716ad.slice/crio-f807a8595fcbcee26e1b6003ccd9ecfad25a9ce622543c02ba607b43e94d1b6e WatchSource:0}: Error finding container f807a8595fcbcee26e1b6003ccd9ecfad25a9ce622543c02ba607b43e94d1b6e: Status 404 returned error can't find the container with id f807a8595fcbcee26e1b6003ccd9ecfad25a9ce622543c02ba607b43e94d1b6e Mar 20 07:16:12 crc kubenswrapper[4749]: W0320 07:16:12.132301 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe9ae029_fa6d_44b8_9e03_af525859dd09.slice/crio-0aa3da459965a32a9cc7e7931e544cf06b06a76d926e4bb3d8e0cbb81a124670 WatchSource:0}: Error finding container 0aa3da459965a32a9cc7e7931e544cf06b06a76d926e4bb3d8e0cbb81a124670: Status 404 returned error can't find the container with id 0aa3da459965a32a9cc7e7931e544cf06b06a76d926e4bb3d8e0cbb81a124670 Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.143114 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.143164 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.143224 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:16:12 crc kubenswrapper[4749]: E0320 07:16:12.143925 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 07:16:12.643913671 +0000 UTC m=+209.193571318 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zz6kk" (UID: "473085e8-ee17-4244-abd0-dcf2308b4655") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.144135 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.156966 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:16:12 crc kubenswrapper[4749]: W0320 07:16:12.157964 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2af8695f_a945_411d_ac95_03191fb3080d.slice/crio-bb4595cd35635888e5c9123be97d5d56c7b048c9b161f766c504cd64942184ed WatchSource:0}: Error finding container bb4595cd35635888e5c9123be97d5d56c7b048c9b161f766c504cd64942184ed: Status 404 returned error can't find the container with id bb4595cd35635888e5c9123be97d5d56c7b048c9b161f766c504cd64942184ed Mar 20 07:16:12 crc kubenswrapper[4749]: W0320 07:16:12.160618 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd842a173_2088_46ee_bdd3_4f058a2c62e8.slice/crio-96ba03724c913b7c345cd1e95cc91bf4b980a5b4abad0b8d0d229c1c228c451c WatchSource:0}: Error finding container 96ba03724c913b7c345cd1e95cc91bf4b980a5b4abad0b8d0d229c1c228c451c: Status 404 returned error can't find the container with id 96ba03724c913b7c345cd1e95cc91bf4b980a5b4abad0b8d0d229c1c228c451c Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.170839 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-dbs7t" podStartSLOduration=151.170825943 podStartE2EDuration="2m31.170825943s" podCreationTimestamp="2026-03-20 07:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:16:12.170338731 +0000 UTC m=+208.719996378" watchObservedRunningTime="2026-03-20 07:16:12.170825943 +0000 UTC m=+208.720483590" Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.172669 4749 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z5lbj" podStartSLOduration=151.172661241 podStartE2EDuration="2m31.172661241s" podCreationTimestamp="2026-03-20 07:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:16:12.123860976 +0000 UTC m=+208.673518643" watchObservedRunningTime="2026-03-20 07:16:12.172661241 +0000 UTC m=+208.722318888" Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.204424 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.205772 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6sv2z" podStartSLOduration=151.205758131 podStartE2EDuration="2m31.205758131s" podCreationTimestamp="2026-03-20 07:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:16:12.204975871 +0000 UTC m=+208.754633518" watchObservedRunningTime="2026-03-20 07:16:12.205758131 +0000 UTC m=+208.755415768" Mar 20 07:16:12 crc kubenswrapper[4749]: W0320 07:16:12.219268 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecc1f279_eced_4b51_8ded_b7d00d089722.slice/crio-e471b2974988b04b2dcb00975625471e1941f4c0a86f656ef5d399f93767acf0 WatchSource:0}: Error finding container e471b2974988b04b2dcb00975625471e1941f4c0a86f656ef5d399f93767acf0: Status 404 returned error can't find the container with id e471b2974988b04b2dcb00975625471e1941f4c0a86f656ef5d399f93767acf0 Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.250843 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.251053 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.251101 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.251132 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d19b89e-d048-4656-b5ce-c637190ab678-metrics-certs\") pod \"network-metrics-daemon-k56zh\" (UID: \"6d19b89e-d048-4656-b5ce-c637190ab678\") " pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:16:12 crc kubenswrapper[4749]: E0320 07:16:12.253260 
4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 07:16:12.753231051 +0000 UTC m=+209.302888698 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.258601 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d19b89e-d048-4656-b5ce-c637190ab678-metrics-certs\") pod \"network-metrics-daemon-k56zh\" (UID: \"6d19b89e-d048-4656-b5ce-c637190ab678\") " pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.260984 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" podStartSLOduration=152.26096763 podStartE2EDuration="2m32.26096763s" podCreationTimestamp="2026-03-20 07:13:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:16:12.258936228 +0000 UTC m=+208.808593875" watchObservedRunningTime="2026-03-20 07:16:12.26096763 +0000 UTC m=+208.810625277" Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.270776 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.276035 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.298204 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-bjx8c" podStartSLOduration=152.298186746 podStartE2EDuration="2m32.298186746s" podCreationTimestamp="2026-03-20 07:13:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:16:12.2947937 +0000 UTC m=+208.844451347" watchObservedRunningTime="2026-03-20 07:16:12.298186746 +0000 UTC m=+208.847844393" Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.352257 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: 
\"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:12 crc kubenswrapper[4749]: E0320 07:16:12.352668 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 07:16:12.852654207 +0000 UTC m=+209.402311854 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zz6kk" (UID: "473085e8-ee17-4244-abd0-dcf2308b4655") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.373850 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gbwxf" podStartSLOduration=152.373828891 podStartE2EDuration="2m32.373828891s" podCreationTimestamp="2026-03-20 07:13:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:16:12.330957309 +0000 UTC m=+208.880614946" watchObservedRunningTime="2026-03-20 07:16:12.373828891 +0000 UTC m=+208.923486538" Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.418267 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-q5gk7" podStartSLOduration=151.418246893 podStartE2EDuration="2m31.418246893s" podCreationTimestamp="2026-03-20 07:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:16:12.371684296 +0000 UTC m=+208.921341943" watchObservedRunningTime="2026-03-20 07:16:12.418246893 +0000 UTC m=+208.967904540" Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.455733 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 07:16:12 crc kubenswrapper[4749]: E0320 07:16:12.456130 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 07:16:12.956114476 +0000 UTC m=+209.505772123 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.505084 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.505372 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-bqpst" podStartSLOduration=151.505353301 podStartE2EDuration="2m31.505353301s" podCreationTimestamp="2026-03-20 07:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:16:12.456074525 +0000 UTC m=+209.005732172" watchObservedRunningTime="2026-03-20 07:16:12.505353301 +0000 UTC m=+209.055010948" Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.519595 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k56zh" Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.522199 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.547571 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-j79zb" event={"ID":"00730545-e9b7-4166-9f09-7a6fcac8cad3","Type":"ContainerStarted","Data":"67cb77d5c367914c438c14e7a50a60f7b25514b9f084346fde48e6b9dcbfb6c7"} Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.547613 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-j79zb" event={"ID":"00730545-e9b7-4166-9f09-7a6fcac8cad3","Type":"ContainerStarted","Data":"96c0d11937cd1764d96d49906a193ace9af75d78ad745e96703b8178636c1e54"} Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.552903 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-j79zb" Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.561430 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:12 crc kubenswrapper[4749]: E0320 07:16:12.561766 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 07:16:13.061755162 +0000 UTC m=+209.611412809 (durationBeforeRetry 500ms). 
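
The failure repeated throughout this window — "driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers" — is the kubelet's CSI code refusing to build a driver client because the driver has not yet announced itself over the plugin-registration mechanism. A minimal sketch of that lookup pattern, with illustrative names (this is not the kubelet's actual implementation):

package main

import (
	"fmt"
	"sync"
)

// driverRegistry stands in for the kubelet's in-memory map of CSI drivers,
// keyed by driver name and filled in when a node plugin registers over the
// plugin-registration socket.
type driverRegistry struct {
	mu      sync.RWMutex
	drivers map[string]string // driver name -> endpoint (illustrative)
}

func (r *driverRegistry) newClient(name string) (string, error) {
	r.mu.RLock()
	defer r.mu.RUnlock()
	ep, ok := r.drivers[name]
	if !ok {
		// Same shape as the error in the surrounding records.
		return "", fmt.Errorf("driver name %s not found in the list of registered CSI drivers", name)
	}
	return ep, nil
}

func main() {
	reg := &driverRegistry{drivers: map[string]string{}}
	if _, err := reg.newClient("kubevirt.io.hostpath-provisioner"); err != nil {
		fmt.Println(err) // fails until the node plugin registers
	}
	reg.mu.Lock()
	reg.drivers["kubevirt.io.hostpath-provisioner"] = "/var/lib/kubelet/plugins/csi-hostpath/csi.sock" // hypothetical socket path
	reg.mu.Unlock()
	if ep, err := reg.newClient("kubevirt.io.hostpath-provisioner"); err == nil {
		fmt.Println("registered at", ep)
	}
}
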
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zz6kk" (UID: "473085e8-ee17-4244-abd0-dcf2308b4655") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.564511 4749 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-j79zb container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/healthz\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.564579 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-j79zb" podUID="00730545-e9b7-4166-9f09-7a6fcac8cad3" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.17:8080/healthz\": dial tcp 10.217.0.17:8080: connect: connection refused" Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.577051 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-j79zb" podStartSLOduration=151.577031574 podStartE2EDuration="2m31.577031574s" podCreationTimestamp="2026-03-20 07:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:16:12.574752476 +0000 UTC m=+209.124410123" watchObservedRunningTime="2026-03-20 07:16:12.577031574 +0000 UTC m=+209.126689221" Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.586819 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-hvf29" event={"ID":"bd1003fd-4300-423c-b500-e782a8aeb7bb","Type":"ContainerStarted","Data":"fbbcf5253dbefa7a01d04ea62ce3369e448d83bba300dfaf0b5689eacc8c9959"} Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.587967 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-hvf29" Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.597531 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-phw2k" event={"ID":"ecc1f279-eced-4b51-8ded-b7d00d089722","Type":"ContainerStarted","Data":"e471b2974988b04b2dcb00975625471e1941f4c0a86f656ef5d399f93767acf0"} Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.618493 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bbrk6" event={"ID":"dce27735-9fe1-49f2-a05a-c042a4b6db32","Type":"ContainerStarted","Data":"63f7338d103a728aa0eaf918ee79ffb11844926657805f2aff39a8c66d7f4e2f"} Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.618548 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bbrk6" event={"ID":"dce27735-9fe1-49f2-a05a-c042a4b6db32","Type":"ContainerStarted","Data":"9dca312edec80f7e5347c46c0ae8b005deddbc90a352998f2af897a5e398a176"} Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.630068 4749 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-hvf29 
container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.630116 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-hvf29" podUID="bd1003fd-4300-423c-b500-e782a8aeb7bb" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.632302 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-bqpst" event={"ID":"214938de-72be-4416-84ab-c12591ef2c68","Type":"ContainerStarted","Data":"b9fe4f543695faf971e599b2edf4321cab734b69170a990fcb437405633a6c5d"} Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.633216 4749 patch_prober.go:28] interesting pod/console-operator-58897d9998-bqpst container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.18:8443/readyz\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.633256 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-bqpst" podUID="214938de-72be-4416-84ab-c12591ef2c68" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.18:8443/readyz\": dial tcp 10.217.0.18:8443: connect: connection refused" Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.649472 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n2fsv" event={"ID":"88f1260c-ea7b-4282-b1af-a7f738cf40b9","Type":"ContainerStarted","Data":"847ecb5ea5e272e1b7890879d1dd709b15a40258aa7b5a37f275e3fe1f5ab65a"} Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.649517 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n2fsv" event={"ID":"88f1260c-ea7b-4282-b1af-a7f738cf40b9","Type":"ContainerStarted","Data":"fe39ad72e9e8d29c1fe0e29c0da066241953aa90ab4204a817f6dffe39abb2fb"} Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.650177 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-hvf29" podStartSLOduration=151.650166304 podStartE2EDuration="2m31.650166304s" podCreationTimestamp="2026-03-20 07:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:16:12.614847686 +0000 UTC m=+209.164505323" watchObservedRunningTime="2026-03-20 07:16:12.650166304 +0000 UTC m=+209.199823951" Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.650526 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n2fsv" Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.652369 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bbrk6" podStartSLOduration=151.65236001 podStartE2EDuration="2m31.65236001s" podCreationTimestamp="2026-03-20 07:13:41 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:16:12.639298645 +0000 UTC m=+209.188956292" watchObservedRunningTime="2026-03-20 07:16:12.65236001 +0000 UTC m=+209.202017657" Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.659662 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566515-6hw5x" event={"ID":"2af8695f-a945-411d-ac95-03191fb3080d","Type":"ContainerStarted","Data":"bb4595cd35635888e5c9123be97d5d56c7b048c9b161f766c504cd64942184ed"} Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.664961 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 07:16:12 crc kubenswrapper[4749]: E0320 07:16:12.666122 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 07:16:13.166106254 +0000 UTC m=+209.715763901 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.686923 4749 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-n2fsv container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.687014 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gd9xh" event={"ID":"1f707ea9-be11-4354-99a3-e439dd4e6173","Type":"ContainerStarted","Data":"adf599288c86148c5f634061bb9d9f46df6348478e3e7b7badfab9d1d3b777a7"} Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.687195 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n2fsv" podUID="88f1260c-ea7b-4282-b1af-a7f738cf40b9" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.687274 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gd9xh" Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.687325 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gd9xh" event={"ID":"1f707ea9-be11-4354-99a3-e439dd4e6173","Type":"ContainerStarted","Data":"76a481a91e796c2d26a63ae6b54566e1b4251ec258335089ca43bfb8b6e14a17"} Mar 20 07:16:12 crc 
kubenswrapper[4749]: I0320 07:16:12.707672 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n2fsv" podStartSLOduration=151.707654432 podStartE2EDuration="2m31.707654432s" podCreationTimestamp="2026-03-20 07:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:16:12.698825414 +0000 UTC m=+209.248483051" watchObservedRunningTime="2026-03-20 07:16:12.707654432 +0000 UTC m=+209.257312069" Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.735967 4749 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-gd9xh container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:5443/healthz\": dial tcp 10.217.0.30:5443: connect: connection refused" start-of-body= Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.736023 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gd9xh" podUID="1f707ea9-be11-4354-99a3-e439dd4e6173" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.30:5443/healthz\": dial tcp 10.217.0.30:5443: connect: connection refused" Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.746403 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gd9xh" podStartSLOduration=151.746382277 podStartE2EDuration="2m31.746382277s" podCreationTimestamp="2026-03-20 07:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:16:12.740958208 +0000 UTC m=+209.290615865" watchObservedRunningTime="2026-03-20 07:16:12.746382277 +0000 UTC m=+209.296039924" Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.766963 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-blksh" event={"ID":"954f5aa5-a05b-44bf-8642-e58746d21984","Type":"ContainerStarted","Data":"27ffae3e4cb4161eb1017e08efa2834ffa05396f797d21e17c2e1b49703ba0f4"} Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.768224 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:12 crc kubenswrapper[4749]: E0320 07:16:12.769893 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 07:16:13.269878931 +0000 UTC m=+209.819536578 (durationBeforeRetry 500ms). 
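
"durationBeforeRetry 500ms" in the nestedpendingoperations records means the failed operation is parked and may not be retried before failure time plus 500ms; the "No retries permitted until ..." deadline is exactly that sum, give or take a few microseconds, because the deadline is stamped when the retry is scheduled rather than when the failure line is logged. Checking the record just above with its own timestamps:

package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	// Failure logged at E0320 07:16:12.769893 in the record above.
	failedAt, err := time.Parse(layout, "2026-03-20 07:16:12.769893 +0000 UTC")
	if err != nil {
		panic(err)
	}
	// Adding the 500ms backoff lands at ~07:16:13.269893, matching the logged
	// deadline "No retries permitted until 2026-03-20 07:16:13.269878931".
	fmt.Println(failedAt.Add(500 * time.Millisecond))
}
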
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zz6kk" (UID: "473085e8-ee17-4244-abd0-dcf2308b4655") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.787617 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-64rv4" event={"ID":"6df619ad-9a4e-4b8b-ba74-0d9b364bdc8d","Type":"ContainerStarted","Data":"98bff21fba435ec951a3ec4c3da429db6ee6f10f363af58cc817274d947172ef"} Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.802102 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-blksh" podStartSLOduration=151.802086759 podStartE2EDuration="2m31.802086759s" podCreationTimestamp="2026-03-20 07:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:16:12.801539225 +0000 UTC m=+209.351196882" watchObservedRunningTime="2026-03-20 07:16:12.802086759 +0000 UTC m=+209.351744406" Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.802472 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-8q7f7" event={"ID":"16b05ee8-fdde-4f11-936e-0982042ccfcf","Type":"ContainerStarted","Data":"dea9d9041987b653e47e25a2923f49b7471d91476b4e33e4403c8c6f9166bdb0"} Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.828547 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566516-6stbk" event={"ID":"da95cd86-f90a-4d7f-a308-4124b22d8427","Type":"ContainerStarted","Data":"74944d5ca2ebb3420955970757f2337dd3fbfccd0cffb9aa2b4076c3cffc8f05"} Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.873222 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 07:16:12 crc kubenswrapper[4749]: E0320 07:16:12.874100 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 07:16:13.37408257 +0000 UTC m=+209.923740217 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.968454 4749 patch_prober.go:28] interesting pod/router-default-5444994796-dbs7t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 07:16:12 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Mar 20 07:16:12 crc kubenswrapper[4749]: [+]process-running ok Mar 20 07:16:12 crc kubenswrapper[4749]: healthz check failed Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.968505 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dbs7t" podUID="da62d543-787a-4364-8271-8f8f9529dd0c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.970871 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-9gg28" event={"ID":"d999d3d0-14e4-4759-98ab-a6d11011ca86","Type":"ContainerStarted","Data":"78520b8f24bee7c76296dd5d99abded0aa5e229a0f8fce4b5ae4e710ae7f5b3b"} Mar 20 07:16:12 crc kubenswrapper[4749]: I0320 07:16:12.981971 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:12 crc kubenswrapper[4749]: E0320 07:16:12.982246 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 07:16:13.482234659 +0000 UTC m=+210.031892306 (durationBeforeRetry 500ms). 
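
The "m=+210.031892306"-style field trailing every timestamp is Go's monotonic clock reading: time.Time values from time.Now() carry one, and Time.String() appends it as "m=±<seconds>". Here it amounts to seconds since the kubelet process initialized its clock, so m=+210 against a wall time of 07:16:13 puts process start near 07:12:43. A small demonstration:

package main

import (
	"fmt"
	"time"
)

func main() {
	t := time.Now()         // carries a monotonic clock reading
	fmt.Println(t)          // prints a trailing "m=+..." field, as in the log
	fmt.Println(t.Round(0)) // Round(0) strips the monotonic reading
}
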
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zz6kk" (UID: "473085e8-ee17-4244-abd0-dcf2308b4655") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.008491 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lwsmc" event={"ID":"c03762c2-c2af-4472-abb5-5017f75e738f","Type":"ContainerStarted","Data":"279dbf289e7fdf4fb594992cebb5285b22d89d9570ed7822ba4f5c8d778411c2"} Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.017262 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s629z" event={"ID":"b75a7366-ff7f-4176-80f2-687c82069d70","Type":"ContainerStarted","Data":"078daa389f1d05223b1abce0e4a188bedbd532a29e607883d967742738108a64"} Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.020372 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c9ncz" event={"ID":"657acde5-fe52-4eaa-812c-00914daf93ba","Type":"ContainerStarted","Data":"a6457cff8f1fe10eab418d5b05357342c8b52248d18c69fe03ee8c9296aefc3f"} Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.056783 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2rp65" event={"ID":"b1b74013-b7b2-4e6b-a227-3161015b1d80","Type":"ContainerStarted","Data":"2a8b3905dec816d400d4b684180b2c9e4068b5fa5834f08b75a75565c8a6198e"} Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.056829 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2rp65" event={"ID":"b1b74013-b7b2-4e6b-a227-3161015b1d80","Type":"ContainerStarted","Data":"eeba5afcbcf73a60ccd0cea042c2533ca794d27b4768eb3262efc41609fae1af"} Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.083672 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 07:16:13 crc kubenswrapper[4749]: E0320 07:16:13.084993 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 07:16:13.5849733 +0000 UTC m=+210.134630947 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.143662 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-vmnvn" event={"ID":"6799ceeb-28d4-4caf-97e4-e9115baae071","Type":"ContainerStarted","Data":"bebb0b53656143060babe683ac72432cd0dc3207246c0c4500daf1631bdee893"} Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.165033 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-9gg28" podStartSLOduration=153.165015397 podStartE2EDuration="2m33.165015397s" podCreationTimestamp="2026-03-20 07:13:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:16:13.091777235 +0000 UTC m=+209.641434882" watchObservedRunningTime="2026-03-20 07:16:13.165015397 +0000 UTC m=+209.714673044" Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.165861 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c9ncz" podStartSLOduration=152.165855109 podStartE2EDuration="2m32.165855109s" podCreationTimestamp="2026-03-20 07:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:16:13.142714544 +0000 UTC m=+209.692372191" watchObservedRunningTime="2026-03-20 07:16:13.165855109 +0000 UTC m=+209.715512756" Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.184927 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:13 crc kubenswrapper[4749]: E0320 07:16:13.186240 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 07:16:13.686229183 +0000 UTC m=+210.235886820 (durationBeforeRetry 500ms). 
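
The pod_startup_latency_tracker records are plain timestamp arithmetic: with firstStartedPulling and lastFinishedPulling both zero (no image pull time to subtract), podStartSLOduration equals podStartE2EDuration, i.e. the observed-running time minus podCreationTimestamp. For the apiserver record above, 07:16:13.165015397 − 07:13:40 = 153.165015397s = 2m33.165015397s. The same subtraction in code, with values copied from the log:

package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	created, _ := time.Parse(layout, "2026-03-20 07:13:40 +0000 UTC")
	running, _ := time.Parse(layout, "2026-03-20 07:16:13.165015397 +0000 UTC")
	d := running.Sub(created)
	fmt.Println(d.Seconds(), d) // 153.165015397 2m33.165015397s
}
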
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zz6kk" (UID: "473085e8-ee17-4244-abd0-dcf2308b4655") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.225028 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5pwjp" event={"ID":"0121f83c-494b-40f1-9a70-65344ed716ad","Type":"ContainerStarted","Data":"f807a8595fcbcee26e1b6003ccd9ecfad25a9ce622543c02ba607b43e94d1b6e"} Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.251771 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2rp65" podStartSLOduration=152.251755917 podStartE2EDuration="2m32.251755917s" podCreationTimestamp="2026-03-20 07:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:16:13.250622818 +0000 UTC m=+209.800280465" watchObservedRunningTime="2026-03-20 07:16:13.251755917 +0000 UTC m=+209.801413564" Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.289258 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 07:16:13 crc kubenswrapper[4749]: E0320 07:16:13.289967 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 07:16:13.789933509 +0000 UTC m=+210.339591156 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.312754 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gbwxf" event={"ID":"d1e24d59-bf58-421f-81a7-cc04d151fdd5","Type":"ContainerStarted","Data":"05072f96076a2f3974ea60b88fe1b2008c23a9a8c45eaa738383a0b2b831227f"} Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.331158 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ldf7s" event={"ID":"2e8fb037-3e85-4c5a-a782-857cb17429af","Type":"ContainerStarted","Data":"5f2e6586fb0e4902dcd58e3fd8f4b4d2a75f27f72be9fd8b07f420defeedfdbd"} Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.385501 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-hlw9w" event={"ID":"16e1bdb5-47b7-40e0-bc4b-cdd87976f461","Type":"ContainerStarted","Data":"91bbe4c313371b641b291721cb82d9dce091b063db43e78bd107940c03763964"} Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.392484 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:13 crc kubenswrapper[4749]: E0320 07:16:13.396365 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 07:16:13.896345654 +0000 UTC m=+210.446003301 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zz6kk" (UID: "473085e8-ee17-4244-abd0-dcf2308b4655") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.425967 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-7f66l" event={"ID":"928b2eb3-aeb6-411f-b215-a33551894e85","Type":"ContainerStarted","Data":"f1e00cc41d5a8bb6b78c5096799b6ade9d8eda62942991f4e9b02a8a8bdd9d2e"} Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.455465 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ldf7s" podStartSLOduration=152.455447043 podStartE2EDuration="2m32.455447043s" podCreationTimestamp="2026-03-20 07:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:16:13.37247282 +0000 UTC m=+209.922130467" watchObservedRunningTime="2026-03-20 07:16:13.455447043 +0000 UTC m=+210.005104690" Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.482520 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-lzpsv" event={"ID":"e2a05065-734d-4884-b037-c54ab87609eb","Type":"ContainerStarted","Data":"36a9fe5cf12caaf834d415a731f40bc20ede125636fd6c43a6e0f9019aff0dd1"} Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.494257 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 07:16:13 crc kubenswrapper[4749]: E0320 07:16:13.494414 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 07:16:13.994390644 +0000 UTC m=+210.544048291 (durationBeforeRetry 500ms). 
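
The strict alternation of "operationExecutor.UnmountVolume started ... pod 8f668bae-..." and "operationExecutor.MountVolume started ... image-registry-697d97f7c8-zz6kk" for the same PVC is the volume reconciler diffing desired state against actual state on every sync pass: the terminated pod (UID 8f668bae-...) still holds the volume in the actual state, while the replacement registry pod (UID 473085e8-...) wants it in the desired state, so each pass queues both an unmount and a mount, and both keep failing on the unregistered driver. A toy version of that compare-and-queue step (illustrative types, not the kubelet's):

package main

import "fmt"

type mountKey struct{ volume, podUID string }

// reconcile diffs desired against actual and returns the operations one
// sync pass would queue: unmounts for stale entries, mounts for missing ones.
func reconcile(desired, actual map[mountKey]bool) (ops []string) {
	for k := range actual {
		if !desired[k] {
			ops = append(ops, fmt.Sprintf("UnmountVolume %s for pod %s", k.volume, k.podUID))
		}
	}
	for k := range desired {
		if !actual[k] {
			ops = append(ops, fmt.Sprintf("MountVolume %s for pod %s", k.volume, k.podUID))
		}
	}
	return ops
}

func main() {
	pvc := "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8"
	actual := map[mountKey]bool{{pvc, "8f668bae-612b-4b75-9490-919e737c6a3b"}: true}  // old pod still holds it
	desired := map[mountKey]bool{{pvc, "473085e8-ee17-4244-abd0-dcf2308b4655"}: true} // new pod wants it
	for _, op := range reconcile(desired, actual) {
		fmt.Println(op)
	}
}
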
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.494509 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:13 crc kubenswrapper[4749]: E0320 07:16:13.495698 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 07:16:13.995687138 +0000 UTC m=+210.545344785 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zz6kk" (UID: "473085e8-ee17-4244-abd0-dcf2308b4655") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.506174 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-7f66l" podStartSLOduration=6.506158666 podStartE2EDuration="6.506158666s" podCreationTimestamp="2026-03-20 07:16:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:16:13.463937311 +0000 UTC m=+210.013594958" watchObservedRunningTime="2026-03-20 07:16:13.506158666 +0000 UTC m=+210.055816313" Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.507819 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bsgpp" podStartSLOduration=153.507812089 podStartE2EDuration="2m33.507812089s" podCreationTimestamp="2026-03-20 07:13:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:16:13.505660154 +0000 UTC m=+210.055317801" watchObservedRunningTime="2026-03-20 07:16:13.507812089 +0000 UTC m=+210.057469726" Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.509749 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hvdhs" event={"ID":"6b71bdcd-f324-489c-a3ae-61ac7648b36a","Type":"ContainerStarted","Data":"3d7b18fee5ea1cb59473a57ebd6739778464c00de7d134e7d055412915a54e27"} Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.515912 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hvdhs" Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.546679 4749 patch_prober.go:28] 
interesting pod/olm-operator-6b444d44fb-hvdhs container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" start-of-body= Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.546790 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hvdhs" podUID="6b71bdcd-f324-489c-a3ae-61ac7648b36a" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.547255 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-jfhjj" event={"ID":"eac86025-7f7e-49a8-ac1b-4bf8c1a65c35","Type":"ContainerStarted","Data":"519728251b0d522b0be9ecbc39d33b76b43af850b98236bd8a86cfee7b2d1917"} Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.547305 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-jfhjj" event={"ID":"eac86025-7f7e-49a8-ac1b-4bf8c1a65c35","Type":"ContainerStarted","Data":"fd2dad77326d20c07be6d027daf62aa7877d7eaadda359dbe47eb81be39ddab8"} Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.572187 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hvdhs" podStartSLOduration=152.572173743 podStartE2EDuration="2m32.572173743s" podCreationTimestamp="2026-03-20 07:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:16:13.569681739 +0000 UTC m=+210.119339386" watchObservedRunningTime="2026-03-20 07:16:13.572173743 +0000 UTC m=+210.121831390" Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.597366 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 07:16:13 crc kubenswrapper[4749]: E0320 07:16:13.598600 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 07:16:14.098582352 +0000 UTC m=+210.648239999 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.618484 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-jfhjj" podStartSLOduration=152.618468103 podStartE2EDuration="2m32.618468103s" podCreationTimestamp="2026-03-20 07:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:16:13.616575724 +0000 UTC m=+210.166233371" watchObservedRunningTime="2026-03-20 07:16:13.618468103 +0000 UTC m=+210.168125750" Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.667905 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-49gfv" event={"ID":"d842a173-2088-46ee-bdd3-4f058a2c62e8","Type":"ContainerStarted","Data":"96ba03724c913b7c345cd1e95cc91bf4b980a5b4abad0b8d0d229c1c228c451c"} Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.698910 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:13 crc kubenswrapper[4749]: E0320 07:16:13.699545 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 07:16:14.199532637 +0000 UTC m=+210.749190284 (durationBeforeRetry 500ms). 
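
The probe failures threaded through these records ("dial tcp 10.217.0.42:8443: connect: connection refused") follow directly from the ordering: the kubelet starts probing as soon as it sees ContainerStarted, and every attempt fails at the TCP connect until the process inside actually binds its port. A stripped-down HTTP readiness probe in the same spirit (a sketch, not the kubelet's prober, though it mirrors the prober's practice of skipping TLS verification for HTTPS probes):

package main

import (
	"crypto/tls"
	"fmt"
	"net/http"
	"time"
)

// probe performs one readiness check: a connect error or non-2xx status
// counts as a failure, as in the probeResult="failure" records above.
func probe(url string) error {
	client := &http.Client{
		Timeout:   time.Second,
		Transport: &http.Transport{TLSClientConfig: &tls.Config{InsecureSkipVerify: true}},
	}
	resp, err := client.Get(url)
	if err != nil {
		return err // e.g. "dial tcp 10.217.0.42:8443: connect: connection refused"
	}
	defer resp.Body.Close()
	if resp.StatusCode < 200 || resp.StatusCode > 299 {
		return fmt.Errorf("HTTP probe failed with statuscode: %d", resp.StatusCode)
	}
	return nil
}

func main() {
	if err := probe("https://10.217.0.42:8443/healthz"); err != nil {
		fmt.Println("Probe failed:", err)
	}
}
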
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zz6kk" (UID: "473085e8-ee17-4244-abd0-dcf2308b4655") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.705069 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-49gfv" podStartSLOduration=6.704995197 podStartE2EDuration="6.704995197s" podCreationTimestamp="2026-03-20 07:16:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:16:13.688904544 +0000 UTC m=+210.238562191" watchObservedRunningTime="2026-03-20 07:16:13.704995197 +0000 UTC m=+210.254652854" Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.716897 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-2zlqs" event={"ID":"ff4ae6b4-eebc-4a32-b390-ec7ea70c8841","Type":"ContainerStarted","Data":"bd50e4fa543b07ebf9d0f5e3db4b3c9481c8d248ce432c831dcae0628425657d"} Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.743652 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nv7hv" event={"ID":"b32a129b-0c90-4d06-87d5-fd7e70b726e5","Type":"ContainerStarted","Data":"2be98877ca09161a9a1583d2c5410ef6613547a234575e514bbef3e2b06a7200"} Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.791404 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-2zlqs" podStartSLOduration=152.791389848 podStartE2EDuration="2m32.791389848s" podCreationTimestamp="2026-03-20 07:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:16:13.752995091 +0000 UTC m=+210.302652738" watchObservedRunningTime="2026-03-20 07:16:13.791389848 +0000 UTC m=+210.341047495" Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.800040 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 07:16:13 crc kubenswrapper[4749]: E0320 07:16:13.803446 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 07:16:14.303407396 +0000 UTC m=+210.853065043 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.826576 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-l285h" event={"ID":"be9ae029-fa6d-44b8-9e03-af525859dd09","Type":"ContainerStarted","Data":"0aa3da459965a32a9cc7e7931e544cf06b06a76d926e4bb3d8e0cbb81a124670"} Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.837223 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nv7hv" podStartSLOduration=152.837206696 podStartE2EDuration="2m32.837206696s" podCreationTimestamp="2026-03-20 07:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:16:13.792505357 +0000 UTC m=+210.342163004" watchObservedRunningTime="2026-03-20 07:16:13.837206696 +0000 UTC m=+210.386864343" Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.844835 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-k56zh"] Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.849253 4749 generic.go:334] "Generic (PLEG): container finished" podID="e222e7a0-549c-46a7-8ee6-484dd2160be4" containerID="7a412300047860f6c9c1db44e89f1d1d4f0a20fcdde9ea329bd947e4dba469ff" exitCode=0 Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.849494 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zznkl" event={"ID":"e222e7a0-549c-46a7-8ee6-484dd2160be4","Type":"ContainerDied","Data":"7a412300047860f6c9c1db44e89f1d1d4f0a20fcdde9ea329bd947e4dba469ff"} Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.852941 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-t8g8d" event={"ID":"d441824b-dc11-4f89-af28-0a5c76439296","Type":"ContainerStarted","Data":"ec3b9c5b26b949504731c6f4493477acf1493f7f410b91af24e03d851fe39688"} Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.878548 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jbqrm" event={"ID":"16c3a232-5504-4648-a65b-2a0d89126e22","Type":"ContainerStarted","Data":"7433737dac8aba891f348ec95f4602715afd38449ac1bb4505c418044f387ebd"} Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.879709 4749 patch_prober.go:28] interesting pod/downloads-7954f5f757-952x2 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.879734 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-952x2" podUID="95e4555b-7f8b-4297-bed6-e0cf5e90ea3e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 
07:16:13.880829 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-l285h" podStartSLOduration=152.880815997 podStartE2EDuration="2m32.880815997s" podCreationTimestamp="2026-03-20 07:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:16:13.86850875 +0000 UTC m=+210.418166397" watchObservedRunningTime="2026-03-20 07:16:13.880815997 +0000 UTC m=+210.430473644" Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.910861 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zznkl" podStartSLOduration=152.910846828 podStartE2EDuration="2m32.910846828s" podCreationTimestamp="2026-03-20 07:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:16:13.909924924 +0000 UTC m=+210.459582581" watchObservedRunningTime="2026-03-20 07:16:13.910846828 +0000 UTC m=+210.460504475" Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.911363 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-q5gk7" Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.914107 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:13 crc kubenswrapper[4749]: E0320 07:16:13.922767 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 07:16:14.422751345 +0000 UTC m=+210.972409092 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zz6kk" (UID: "473085e8-ee17-4244-abd0-dcf2308b4655") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.959950 4749 patch_prober.go:28] interesting pod/router-default-5444994796-dbs7t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 07:16:13 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Mar 20 07:16:13 crc kubenswrapper[4749]: [+]process-running ok Mar 20 07:16:13 crc kubenswrapper[4749]: healthz check failed Mar 20 07:16:13 crc kubenswrapper[4749]: I0320 07:16:13.959995 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dbs7t" podUID="da62d543-787a-4364-8271-8f8f9529dd0c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 07:16:14 crc kubenswrapper[4749]: I0320 07:16:14.015513 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 07:16:14 crc kubenswrapper[4749]: E0320 07:16:14.015801 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 07:16:14.515774645 +0000 UTC m=+211.065432292 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:14 crc kubenswrapper[4749]: I0320 07:16:14.015846 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:14 crc kubenswrapper[4749]: E0320 07:16:14.016180 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 07:16:14.516174095 +0000 UTC m=+211.065831742 (durationBeforeRetry 500ms). 
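
The router's startup-probe output above ("[-]backend-http failed: reason withheld", "[+]process-running ok", "healthz check failed") is the usual aggregated-healthz shape: the endpoint runs each named check, emits one [+]/[-] line per check with the reason withheld from the probe client, and answers 500 if anything failed, which the prober then reports as "HTTP probe failed with statuscode: 500". A minimal handler producing that shape (a sketch, not the router's or apiserver's actual implementation):

package main

import (
	"fmt"
	"net/http"
)

type check struct {
	name string
	run  func() error
}

// healthz runs every check and renders the [+]/[-] report seen in the log.
func healthz(checks []check) http.HandlerFunc {
	return func(w http.ResponseWriter, r *http.Request) {
		body, failed := "", false
		for _, c := range checks {
			if err := c.run(); err != nil {
				failed = true
				body += fmt.Sprintf("[-]%s failed: reason withheld\n", c.name)
			} else {
				body += fmt.Sprintf("[+]%s ok\n", c.name)
			}
		}
		if failed {
			w.WriteHeader(http.StatusInternalServerError) // the probe reports statuscode: 500
			fmt.Fprint(w, body+"healthz check failed\n")
			return
		}
		fmt.Fprint(w, body+"healthz check passed\n")
	}
}

func main() {
	checks := []check{
		{"backend-http", func() error { return fmt.Errorf("not ready") }},
		{"has-synced", func() error { return fmt.Errorf("not ready") }},
		{"process-running", func() error { return nil }},
	}
	http.Handle("/healthz", healthz(checks))
	http.ListenAndServe(":8080", nil)
}
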
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zz6kk" (UID: "473085e8-ee17-4244-abd0-dcf2308b4655") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:14 crc kubenswrapper[4749]: I0320 07:16:14.035792 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jbqrm" podStartSLOduration=153.035774999 podStartE2EDuration="2m33.035774999s" podCreationTimestamp="2026-03-20 07:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:16:13.983481175 +0000 UTC m=+210.533138822" watchObservedRunningTime="2026-03-20 07:16:14.035774999 +0000 UTC m=+210.585432646" Mar 20 07:16:14 crc kubenswrapper[4749]: I0320 07:16:14.117997 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 07:16:14 crc kubenswrapper[4749]: E0320 07:16:14.118353 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 07:16:14.618339641 +0000 UTC m=+211.167997288 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:14 crc kubenswrapper[4749]: I0320 07:16:14.164992 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-9gg28" Mar 20 07:16:14 crc kubenswrapper[4749]: I0320 07:16:14.165322 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-9gg28" Mar 20 07:16:14 crc kubenswrapper[4749]: I0320 07:16:14.185708 4749 patch_prober.go:28] interesting pod/apiserver-76f77b778f-9gg28 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 20 07:16:14 crc kubenswrapper[4749]: [+]log ok Mar 20 07:16:14 crc kubenswrapper[4749]: [+]etcd ok Mar 20 07:16:14 crc kubenswrapper[4749]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 20 07:16:14 crc kubenswrapper[4749]: [+]poststarthook/generic-apiserver-start-informers ok Mar 20 07:16:14 crc kubenswrapper[4749]: [+]poststarthook/max-in-flight-filter ok Mar 20 07:16:14 crc kubenswrapper[4749]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 20 07:16:14 crc kubenswrapper[4749]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 20 07:16:14 crc kubenswrapper[4749]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 20 07:16:14 crc kubenswrapper[4749]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Mar 20 07:16:14 crc kubenswrapper[4749]: [+]poststarthook/project.openshift.io-projectcache ok Mar 20 07:16:14 crc kubenswrapper[4749]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 20 07:16:14 crc kubenswrapper[4749]: [+]poststarthook/openshift.io-startinformers ok Mar 20 07:16:14 crc kubenswrapper[4749]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 20 07:16:14 crc kubenswrapper[4749]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 20 07:16:14 crc kubenswrapper[4749]: livez check failed Mar 20 07:16:14 crc kubenswrapper[4749]: I0320 07:16:14.185756 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-9gg28" podUID="d999d3d0-14e4-4759-98ab-a6d11011ca86" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 07:16:14 crc kubenswrapper[4749]: I0320 07:16:14.223170 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:14 crc kubenswrapper[4749]: E0320 07:16:14.223544 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 07:16:14.723532705 +0000 UTC m=+211.273190352 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zz6kk" (UID: "473085e8-ee17-4244-abd0-dcf2308b4655") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:14 crc kubenswrapper[4749]: I0320 07:16:14.329463 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 07:16:14 crc kubenswrapper[4749]: E0320 07:16:14.329957 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 07:16:14.82994269 +0000 UTC m=+211.379600337 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:14 crc kubenswrapper[4749]: I0320 07:16:14.433951 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:14 crc kubenswrapper[4749]: E0320 07:16:14.434255 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 07:16:14.934242262 +0000 UTC m=+211.483899909 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zz6kk" (UID: "473085e8-ee17-4244-abd0-dcf2308b4655") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:14 crc kubenswrapper[4749]: I0320 07:16:14.504256 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zznkl" Mar 20 07:16:14 crc kubenswrapper[4749]: I0320 07:16:14.504479 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zznkl" Mar 20 07:16:14 crc kubenswrapper[4749]: I0320 07:16:14.536331 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 07:16:14 crc kubenswrapper[4749]: E0320 07:16:14.536543 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 07:16:15.036527331 +0000 UTC m=+211.586184968 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:14 crc kubenswrapper[4749]: I0320 07:16:14.637049 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:14 crc kubenswrapper[4749]: E0320 07:16:14.637348 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 07:16:15.137337072 +0000 UTC m=+211.686994719 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zz6kk" (UID: "473085e8-ee17-4244-abd0-dcf2308b4655") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:14 crc kubenswrapper[4749]: I0320 07:16:14.737928 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 07:16:14 crc kubenswrapper[4749]: E0320 07:16:14.738135 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 07:16:15.238112112 +0000 UTC m=+211.787769759 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:14 crc kubenswrapper[4749]: I0320 07:16:14.738399 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:14 crc kubenswrapper[4749]: E0320 07:16:14.738687 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 07:16:15.238674586 +0000 UTC m=+211.788332233 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zz6kk" (UID: "473085e8-ee17-4244-abd0-dcf2308b4655") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:14 crc kubenswrapper[4749]: I0320 07:16:14.839369 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 07:16:14 crc kubenswrapper[4749]: E0320 07:16:14.839569 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 07:16:15.339536359 +0000 UTC m=+211.889194006 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:14 crc kubenswrapper[4749]: I0320 07:16:14.896310 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-49gfv" event={"ID":"d842a173-2088-46ee-bdd3-4f058a2c62e8","Type":"ContainerStarted","Data":"c5970becdc453817ace40446b7c01d82d7177407c64c800353eb8ca062e2ed94"} Mar 20 07:16:14 crc kubenswrapper[4749]: I0320 07:16:14.913667 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jbqrm" event={"ID":"16c3a232-5504-4648-a65b-2a0d89126e22","Type":"ContainerStarted","Data":"49d95868cf637c34d07072128ddd61bc7dfa9136252e68c1afda329e4760b3d9"} Mar 20 07:16:14 crc kubenswrapper[4749]: I0320 07:16:14.913716 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jbqrm" event={"ID":"16c3a232-5504-4648-a65b-2a0d89126e22","Type":"ContainerStarted","Data":"a52dee94ba1ebcedb937ad4a5404600c0653e59d37d8097fffdf05ae4ccc97a2"} Mar 20 07:16:14 crc kubenswrapper[4749]: I0320 07:16:14.916122 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566515-6hw5x" event={"ID":"2af8695f-a945-411d-ac95-03191fb3080d","Type":"ContainerStarted","Data":"04c94989401b3e171aa790f44e5e33a1bb2df54d0c87db557e5ced47d88d57d8"} Mar 20 07:16:14 crc kubenswrapper[4749]: I0320 07:16:14.923410 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bsgpp" event={"ID":"a758ef11-ae4c-4d21-96b4-0a8bded670a3","Type":"ContainerStarted","Data":"6e86474b584625e4f5ce5d98c237a38584538e7eff3bebd0b91068582270e8e1"} Mar 20 07:16:14 crc kubenswrapper[4749]: I0320 07:16:14.924993 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"944b9b09c36bf0369d8dc9e75118a29c9bd73848a05d75274733235432e233f2"} Mar 20 07:16:14 crc kubenswrapper[4749]: I0320 07:16:14.928788 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s629z" event={"ID":"b75a7366-ff7f-4176-80f2-687c82069d70","Type":"ContainerStarted","Data":"810f5e14ec29e1b71cf134aeb319b8985e3bd3b9ff44b82c5fae8d880624124d"} Mar 20 07:16:14 crc kubenswrapper[4749]: I0320 07:16:14.932512 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29566515-6hw5x" podStartSLOduration=74.932497189 podStartE2EDuration="1m14.932497189s" podCreationTimestamp="2026-03-20 07:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:16:14.931704118 +0000 UTC m=+211.481361765" watchObservedRunningTime="2026-03-20 07:16:14.932497189 +0000 UTC m=+211.482154836" Mar 20 07:16:14 crc kubenswrapper[4749]: I0320 07:16:14.939702 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-phw2k" event={"ID":"ecc1f279-eced-4b51-8ded-b7d00d089722","Type":"ContainerStarted","Data":"fe7b6b2c05fd84b2a32b9d123ba4bfa4a9636926619bff37be9f82d8ed58487d"} Mar 20 07:16:14 crc kubenswrapper[4749]: I0320 07:16:14.939744 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-phw2k" event={"ID":"ecc1f279-eced-4b51-8ded-b7d00d089722","Type":"ContainerStarted","Data":"1f052fa03930830fae00d5a63d6d9e7d68d4831f650c4de9fb03f440097f0962"} Mar 20 07:16:14 crc kubenswrapper[4749]: I0320 07:16:14.940069 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:14 crc kubenswrapper[4749]: E0320 07:16:14.942425 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 07:16:15.442410064 +0000 UTC m=+211.992067711 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zz6kk" (UID: "473085e8-ee17-4244-abd0-dcf2308b4655") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:14 crc kubenswrapper[4749]: I0320 07:16:14.948618 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zznkl" event={"ID":"e222e7a0-549c-46a7-8ee6-484dd2160be4","Type":"ContainerStarted","Data":"4d2ee0c061e201f98680b721736dbffe0f7eee9ba847a31a88253483a89a4af7"} Mar 20 07:16:14 crc kubenswrapper[4749]: I0320 07:16:14.967007 4749 patch_prober.go:28] interesting pod/router-default-5444994796-dbs7t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 07:16:14 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Mar 20 07:16:14 crc kubenswrapper[4749]: [+]process-running ok Mar 20 07:16:14 crc kubenswrapper[4749]: healthz check failed Mar 20 07:16:14 crc kubenswrapper[4749]: I0320 07:16:14.967060 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dbs7t" podUID="da62d543-787a-4364-8271-8f8f9529dd0c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 07:16:14 crc kubenswrapper[4749]: I0320 07:16:14.972067 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-hlw9w" event={"ID":"16e1bdb5-47b7-40e0-bc4b-cdd87976f461","Type":"ContainerStarted","Data":"ab7f6ef63ce03db631dd7e3c009dcdb86b8c704e4b3115117f7ae5e4c8073807"} Mar 20 07:16:14 crc kubenswrapper[4749]: I0320 07:16:14.976178 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s629z" podStartSLOduration=153.976164781 podStartE2EDuration="2m33.976164781s" podCreationTimestamp="2026-03-20 07:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:16:14.972744584 +0000 UTC m=+211.522402231" watchObservedRunningTime="2026-03-20 07:16:14.976164781 +0000 UTC m=+211.525822428" Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.029462 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-k56zh" event={"ID":"6d19b89e-d048-4656-b5ce-c637190ab678","Type":"ContainerStarted","Data":"d5e4c6c70baf65c864676eed1f98cc78cbcf8d973502fb58eb532adec5d56674"} Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.034676 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-t8g8d" event={"ID":"d441824b-dc11-4f89-af28-0a5c76439296","Type":"ContainerStarted","Data":"bd611677127ffa2efd80b49ff76385750cf0779918b0bd00f27037dae3a046dd"} Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.034717 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-t8g8d" event={"ID":"d441824b-dc11-4f89-af28-0a5c76439296","Type":"ContainerStarted","Data":"a562ceecc5a411780817db04b6ffafb120f0ae4439eae6bbcd2ebd3e83d62af3"} Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.035304 
4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-t8g8d" Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.042584 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 07:16:15 crc kubenswrapper[4749]: E0320 07:16:15.042777 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 07:16:15.542753843 +0000 UTC m=+212.092411490 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.042835 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:15 crc kubenswrapper[4749]: E0320 07:16:15.043089 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 07:16:15.543077641 +0000 UTC m=+212.092735288 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zz6kk" (UID: "473085e8-ee17-4244-abd0-dcf2308b4655") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.043139 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"ea6b388a86d5204bb19cdebbfe5c0548a9be43b0dad71908a350b3a7079ddfec"} Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.043172 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"a706cbb20b64806a329efddc2708bb399f1aa50332a27a30f0385dab229e86e3"} Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.051983 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hvdhs" event={"ID":"6b71bdcd-f324-489c-a3ae-61ac7648b36a","Type":"ContainerStarted","Data":"6f592a1730b557b92f7255b61ef79485ae5d5a1779204ff80cc7e4419b2b923e"} Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.052637 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-phw2k" podStartSLOduration=154.052620017 podStartE2EDuration="2m34.052620017s" podCreationTimestamp="2026-03-20 07:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:16:15.049217069 +0000 UTC m=+211.598874716" watchObservedRunningTime="2026-03-20 07:16:15.052620017 +0000 UTC m=+211.602277664" Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.056657 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lwsmc" event={"ID":"c03762c2-c2af-4472-abb5-5017f75e738f","Type":"ContainerStarted","Data":"7fa63517a5f5f4ec5158800d9570776e85588a005ec14cf79cae372af9db8982"} Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.065941 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-64rv4" event={"ID":"6df619ad-9a4e-4b8b-ba74-0d9b364bdc8d","Type":"ContainerStarted","Data":"dd12e46c5fd4b7ae280fe1539ec8fba440a412e2145330b6928353e5717d6f1a"} Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.071328 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"027d6844ff59ae3171ac25430eedc91de6a285e39528c7df20d6503472d91d22"} Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.071821 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.096651 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hvdhs" Mar 20 07:16:15 crc 
kubenswrapper[4749]: I0320 07:16:15.097017 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-64rv4" podStartSLOduration=155.097008107 podStartE2EDuration="2m35.097008107s" podCreationTimestamp="2026-03-20 07:13:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:16:15.096877944 +0000 UTC m=+211.646535601" watchObservedRunningTime="2026-03-20 07:16:15.097008107 +0000 UTC m=+211.646665744" Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.097806 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-hlw9w" podStartSLOduration=154.097801338 podStartE2EDuration="2m34.097801338s" podCreationTimestamp="2026-03-20 07:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:16:15.071964603 +0000 UTC m=+211.621622240" watchObservedRunningTime="2026-03-20 07:16:15.097801338 +0000 UTC m=+211.647458985" Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.119943 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-8q7f7" event={"ID":"16b05ee8-fdde-4f11-936e-0982042ccfcf","Type":"ContainerStarted","Data":"d0086b87effa1d13fa72c8cd4ec85791ef98db6cea3bbaa956c1397d15881bb7"} Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.133492 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-lzpsv" event={"ID":"e2a05065-734d-4884-b037-c54ab87609eb","Type":"ContainerStarted","Data":"99fd7953f5815791c00f0e5709bb431226a588c5303b80f084c3515703610936"} Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.135148 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-vmnvn" event={"ID":"6799ceeb-28d4-4caf-97e4-e9115baae071","Type":"ContainerStarted","Data":"23fdfb91cd413f785b0a765ada2a0ce1efa18656a842edccb6b7b73099e30f36"} Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.146718 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 07:16:15 crc kubenswrapper[4749]: E0320 07:16:15.149006 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 07:16:15.648986543 +0000 UTC m=+212.198644190 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.174700 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5pwjp" event={"ID":"0121f83c-494b-40f1-9a70-65344ed716ad","Type":"ContainerStarted","Data":"28e4ff0f171f1a47b18ca0bb5bfef97699f87d77ab25f55ae7b2ba45e3a7d805"} Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.174751 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5pwjp" event={"ID":"0121f83c-494b-40f1-9a70-65344ed716ad","Type":"ContainerStarted","Data":"5309ff71e5f72e4a3c91454d8cdc5291ef095174efdec9bea42e7cb527361a19"} Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.175421 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5pwjp" Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.207349 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lwsmc" podStartSLOduration=154.207329363 podStartE2EDuration="2m34.207329363s" podCreationTimestamp="2026-03-20 07:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:16:15.200865026 +0000 UTC m=+211.750522673" watchObservedRunningTime="2026-03-20 07:16:15.207329363 +0000 UTC m=+211.756987010" Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.233672 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-l285h" event={"ID":"be9ae029-fa6d-44b8-9e03-af525859dd09","Type":"ContainerStarted","Data":"7f67a1dd802193df78e7c965609f8903ca14690713a422ed016b2287e7fdff81"} Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.237953 4749 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-j79zb container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/healthz\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.238002 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-j79zb" podUID="00730545-e9b7-4166-9f09-7a6fcac8cad3" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.17:8080/healthz\": dial tcp 10.217.0.17:8080: connect: connection refused" Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.247891 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-bqpst" Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.269480 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gd9xh" Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 
07:16:15.270101 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n2fsv" Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.270368 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-t8g8d" podStartSLOduration=8.270355243000001 podStartE2EDuration="8.270355243s" podCreationTimestamp="2026-03-20 07:16:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:16:15.239004767 +0000 UTC m=+211.788662414" watchObservedRunningTime="2026-03-20 07:16:15.270355243 +0000 UTC m=+211.820012890" Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.333553 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-hvf29" Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.347313 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:15 crc kubenswrapper[4749]: E0320 07:16:15.358226 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 07:16:15.858211561 +0000 UTC m=+212.407869208 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zz6kk" (UID: "473085e8-ee17-4244-abd0-dcf2308b4655") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.375647 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zznkl" Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.381172 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-lzpsv" podStartSLOduration=154.381158371 podStartE2EDuration="2m34.381158371s" podCreationTimestamp="2026-03-20 07:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:16:15.378470622 +0000 UTC m=+211.928128269" watchObservedRunningTime="2026-03-20 07:16:15.381158371 +0000 UTC m=+211.930816018" Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.452555 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.452622 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5pwjp" podStartSLOduration=154.452603587 podStartE2EDuration="2m34.452603587s" podCreationTimestamp="2026-03-20 07:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:16:15.416658544 +0000 UTC m=+211.966316191" watchObservedRunningTime="2026-03-20 07:16:15.452603587 +0000 UTC m=+212.002261234" Mar 20 07:16:15 crc kubenswrapper[4749]: E0320 07:16:15.452757 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 07:16:15.952740951 +0000 UTC m=+212.502398598 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.455003 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:15 crc kubenswrapper[4749]: E0320 07:16:15.455251 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 07:16:15.955240175 +0000 UTC m=+212.504897822 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zz6kk" (UID: "473085e8-ee17-4244-abd0-dcf2308b4655") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.557099 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 07:16:15 crc kubenswrapper[4749]: E0320 07:16:15.557486 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 07:16:16.057467143 +0000 UTC m=+212.607124790 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.571339 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rv5h9"] Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.572322 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rv5h9" Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.575701 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.626231 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rv5h9"] Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.628264 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-vmnvn" podStartSLOduration=154.628244912 podStartE2EDuration="2m34.628244912s" podCreationTimestamp="2026-03-20 07:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:16:15.617063785 +0000 UTC m=+212.166721442" watchObservedRunningTime="2026-03-20 07:16:15.628244912 +0000 UTC m=+212.177902559" Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.658405 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7e5d15e-f3f5-4595-be01-ae4f196285ad-catalog-content\") pod \"community-operators-rv5h9\" (UID: \"b7e5d15e-f3f5-4595-be01-ae4f196285ad\") " pod="openshift-marketplace/community-operators-rv5h9" Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.658452 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7e5d15e-f3f5-4595-be01-ae4f196285ad-utilities\") pod \"community-operators-rv5h9\" (UID: \"b7e5d15e-f3f5-4595-be01-ae4f196285ad\") " pod="openshift-marketplace/community-operators-rv5h9" Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.658484 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.658541 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgp9z\" (UniqueName: \"kubernetes.io/projected/b7e5d15e-f3f5-4595-be01-ae4f196285ad-kube-api-access-fgp9z\") pod \"community-operators-rv5h9\" (UID: \"b7e5d15e-f3f5-4595-be01-ae4f196285ad\") " pod="openshift-marketplace/community-operators-rv5h9" Mar 20 07:16:15 crc kubenswrapper[4749]: E0320 07:16:15.658827 4749 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 07:16:16.158815688 +0000 UTC m=+212.708473335 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zz6kk" (UID: "473085e8-ee17-4244-abd0-dcf2308b4655") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.759663 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.759861 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgp9z\" (UniqueName: \"kubernetes.io/projected/b7e5d15e-f3f5-4595-be01-ae4f196285ad-kube-api-access-fgp9z\") pod \"community-operators-rv5h9\" (UID: \"b7e5d15e-f3f5-4595-be01-ae4f196285ad\") " pod="openshift-marketplace/community-operators-rv5h9" Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.759906 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7e5d15e-f3f5-4595-be01-ae4f196285ad-catalog-content\") pod \"community-operators-rv5h9\" (UID: \"b7e5d15e-f3f5-4595-be01-ae4f196285ad\") " pod="openshift-marketplace/community-operators-rv5h9" Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.759937 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7e5d15e-f3f5-4595-be01-ae4f196285ad-utilities\") pod \"community-operators-rv5h9\" (UID: \"b7e5d15e-f3f5-4595-be01-ae4f196285ad\") " pod="openshift-marketplace/community-operators-rv5h9" Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.760385 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7e5d15e-f3f5-4595-be01-ae4f196285ad-utilities\") pod \"community-operators-rv5h9\" (UID: \"b7e5d15e-f3f5-4595-be01-ae4f196285ad\") " pod="openshift-marketplace/community-operators-rv5h9" Mar 20 07:16:15 crc kubenswrapper[4749]: E0320 07:16:15.760471 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 07:16:16.26045392 +0000 UTC m=+212.810111567 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.760900 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7e5d15e-f3f5-4595-be01-ae4f196285ad-catalog-content\") pod \"community-operators-rv5h9\" (UID: \"b7e5d15e-f3f5-4595-be01-ae4f196285ad\") " pod="openshift-marketplace/community-operators-rv5h9" Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.810240 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9ss5d"] Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.811060 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9ss5d" Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.816809 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.841218 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgp9z\" (UniqueName: \"kubernetes.io/projected/b7e5d15e-f3f5-4595-be01-ae4f196285ad-kube-api-access-fgp9z\") pod \"community-operators-rv5h9\" (UID: \"b7e5d15e-f3f5-4595-be01-ae4f196285ad\") " pod="openshift-marketplace/community-operators-rv5h9" Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.855338 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9ss5d"] Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.862860 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.862935 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8-utilities\") pod \"certified-operators-9ss5d\" (UID: \"a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8\") " pod="openshift-marketplace/certified-operators-9ss5d" Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.862956 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d65vz\" (UniqueName: \"kubernetes.io/projected/a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8-kube-api-access-d65vz\") pod \"certified-operators-9ss5d\" (UID: \"a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8\") " pod="openshift-marketplace/certified-operators-9ss5d" Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.862984 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8-catalog-content\") pod 
\"certified-operators-9ss5d\" (UID: \"a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8\") " pod="openshift-marketplace/certified-operators-9ss5d" Mar 20 07:16:15 crc kubenswrapper[4749]: E0320 07:16:15.863234 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 07:16:16.363223312 +0000 UTC m=+212.912880959 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zz6kk" (UID: "473085e8-ee17-4244-abd0-dcf2308b4655") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.929091 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rv5h9" Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.965751 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 07:16:15 crc kubenswrapper[4749]: E0320 07:16:15.965873 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 07:16:16.46585872 +0000 UTC m=+213.015516367 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.966154 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8-utilities\") pod \"certified-operators-9ss5d\" (UID: \"a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8\") " pod="openshift-marketplace/certified-operators-9ss5d" Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.966175 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d65vz\" (UniqueName: \"kubernetes.io/projected/a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8-kube-api-access-d65vz\") pod \"certified-operators-9ss5d\" (UID: \"a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8\") " pod="openshift-marketplace/certified-operators-9ss5d" Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.966195 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8-catalog-content\") pod \"certified-operators-9ss5d\" (UID: \"a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8\") " pod="openshift-marketplace/certified-operators-9ss5d" Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.966248 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.966819 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8-catalog-content\") pod \"certified-operators-9ss5d\" (UID: \"a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8\") " pod="openshift-marketplace/certified-operators-9ss5d" Mar 20 07:16:15 crc kubenswrapper[4749]: E0320 07:16:15.966968 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 07:16:16.466959619 +0000 UTC m=+213.016617266 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zz6kk" (UID: "473085e8-ee17-4244-abd0-dcf2308b4655") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.967371 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8-utilities\") pod \"certified-operators-9ss5d\" (UID: \"a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8\") " pod="openshift-marketplace/certified-operators-9ss5d" Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.981752 4749 patch_prober.go:28] interesting pod/router-default-5444994796-dbs7t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 07:16:15 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Mar 20 07:16:15 crc kubenswrapper[4749]: [+]process-running ok Mar 20 07:16:15 crc kubenswrapper[4749]: healthz check failed Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.981810 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dbs7t" podUID="da62d543-787a-4364-8271-8f8f9529dd0c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 07:16:15 crc kubenswrapper[4749]: I0320 07:16:15.982487 4749 ???:1] "http: TLS handshake error from 192.168.126.11:34198: no serving certificate available for the kubelet" Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:15.996257 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-x9swj"] Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:15.997498 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x9swj" Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.027650 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x9swj"] Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.048735 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d65vz\" (UniqueName: \"kubernetes.io/projected/a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8-kube-api-access-d65vz\") pod \"certified-operators-9ss5d\" (UID: \"a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8\") " pod="openshift-marketplace/certified-operators-9ss5d" Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.071073 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 07:16:16 crc kubenswrapper[4749]: E0320 07:16:16.071371 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 07:16:16.571355351 +0000 UTC m=+213.121012988 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.087431 4749 ???:1] "http: TLS handshake error from 192.168.126.11:34204: no serving certificate available for the kubelet" Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.152030 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bbjcq"] Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.153025 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bbjcq" Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.169456 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bbjcq"] Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.170589 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9ss5d" Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.171962 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.172007 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a22f47dc-59ce-4cce-821c-508fc14a9508-catalog-content\") pod \"community-operators-x9swj\" (UID: \"a22f47dc-59ce-4cce-821c-508fc14a9508\") " pod="openshift-marketplace/community-operators-x9swj" Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.172058 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a22f47dc-59ce-4cce-821c-508fc14a9508-utilities\") pod \"community-operators-x9swj\" (UID: \"a22f47dc-59ce-4cce-821c-508fc14a9508\") " pod="openshift-marketplace/community-operators-x9swj" Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.172100 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk8jn\" (UniqueName: \"kubernetes.io/projected/a22f47dc-59ce-4cce-821c-508fc14a9508-kube-api-access-hk8jn\") pod \"community-operators-x9swj\" (UID: \"a22f47dc-59ce-4cce-821c-508fc14a9508\") " pod="openshift-marketplace/community-operators-x9swj" Mar 20 07:16:16 crc kubenswrapper[4749]: E0320 07:16:16.172386 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 07:16:16.672374818 +0000 UTC m=+213.222032465 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zz6kk" (UID: "473085e8-ee17-4244-abd0-dcf2308b4655") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.199269 4749 ???:1] "http: TLS handshake error from 192.168.126.11:34208: no serving certificate available for the kubelet" Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.266603 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"f5d55a0e7a3257cf1102ce1e822158ede2a60f57196d4817bf520c9643aa2383"} Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.272916 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.273074 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0-utilities\") pod \"certified-operators-bbjcq\" (UID: \"ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0\") " pod="openshift-marketplace/certified-operators-bbjcq" Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.273111 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a22f47dc-59ce-4cce-821c-508fc14a9508-utilities\") pod \"community-operators-x9swj\" (UID: \"a22f47dc-59ce-4cce-821c-508fc14a9508\") " pod="openshift-marketplace/community-operators-x9swj" Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.273149 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckwfm\" (UniqueName: \"kubernetes.io/projected/ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0-kube-api-access-ckwfm\") pod \"certified-operators-bbjcq\" (UID: \"ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0\") " pod="openshift-marketplace/certified-operators-bbjcq" Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.273170 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0-catalog-content\") pod \"certified-operators-bbjcq\" (UID: \"ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0\") " pod="openshift-marketplace/certified-operators-bbjcq" Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.273189 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hk8jn\" (UniqueName: \"kubernetes.io/projected/a22f47dc-59ce-4cce-821c-508fc14a9508-kube-api-access-hk8jn\") pod \"community-operators-x9swj\" (UID: \"a22f47dc-59ce-4cce-821c-508fc14a9508\") " pod="openshift-marketplace/community-operators-x9swj" Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.273221 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a22f47dc-59ce-4cce-821c-508fc14a9508-catalog-content\") pod \"community-operators-x9swj\" (UID: \"a22f47dc-59ce-4cce-821c-508fc14a9508\") " pod="openshift-marketplace/community-operators-x9swj" Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.273596 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a22f47dc-59ce-4cce-821c-508fc14a9508-catalog-content\") pod \"community-operators-x9swj\" (UID: \"a22f47dc-59ce-4cce-821c-508fc14a9508\") " pod="openshift-marketplace/community-operators-x9swj" Mar 20 07:16:16 crc kubenswrapper[4749]: E0320 07:16:16.273668 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 07:16:16.773652932 +0000 UTC m=+213.323310579 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.273976 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a22f47dc-59ce-4cce-821c-508fc14a9508-utilities\") pod \"community-operators-x9swj\" (UID: \"a22f47dc-59ce-4cce-821c-508fc14a9508\") " pod="openshift-marketplace/community-operators-x9swj" Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.275955 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-k56zh" event={"ID":"6d19b89e-d048-4656-b5ce-c637190ab678","Type":"ContainerStarted","Data":"f16be1ec0433e171543d3e0f49d918605783ec39e18e4605eab8fcdb2277c488"} Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.276005 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-k56zh" event={"ID":"6d19b89e-d048-4656-b5ce-c637190ab678","Type":"ContainerStarted","Data":"5a614791935b9131302aeca0ad99a34e49ce39d1667fe4945e107e9c4403c9b1"} Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.283986 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"afb763d055da04810fc3459fa28d02651f9b709ec757f0d85d5877476ad53710"} Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.289009 4749 ???:1] "http: TLS handshake error from 192.168.126.11:34222: no serving certificate available for the kubelet" Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.296984 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-j79zb" Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.297166 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zznkl" Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.303478 4749 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-hk8jn\" (UniqueName: \"kubernetes.io/projected/a22f47dc-59ce-4cce-821c-508fc14a9508-kube-api-access-hk8jn\") pod \"community-operators-x9swj\" (UID: \"a22f47dc-59ce-4cce-821c-508fc14a9508\") " pod="openshift-marketplace/community-operators-x9swj" Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.344734 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-k56zh" podStartSLOduration=155.344717288 podStartE2EDuration="2m35.344717288s" podCreationTimestamp="2026-03-20 07:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:16:16.307805229 +0000 UTC m=+212.857462876" watchObservedRunningTime="2026-03-20 07:16:16.344717288 +0000 UTC m=+212.894374935" Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.377251 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckwfm\" (UniqueName: \"kubernetes.io/projected/ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0-kube-api-access-ckwfm\") pod \"certified-operators-bbjcq\" (UID: \"ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0\") " pod="openshift-marketplace/certified-operators-bbjcq" Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.377804 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0-catalog-content\") pod \"certified-operators-bbjcq\" (UID: \"ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0\") " pod="openshift-marketplace/certified-operators-bbjcq" Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.377917 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.379768 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0-catalog-content\") pod \"certified-operators-bbjcq\" (UID: \"ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0\") " pod="openshift-marketplace/certified-operators-bbjcq" Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.382768 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-x9swj" Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.383976 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0-utilities\") pod \"certified-operators-bbjcq\" (UID: \"ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0\") " pod="openshift-marketplace/certified-operators-bbjcq" Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.384205 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0-utilities\") pod \"certified-operators-bbjcq\" (UID: \"ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0\") " pod="openshift-marketplace/certified-operators-bbjcq" Mar 20 07:16:16 crc kubenswrapper[4749]: E0320 07:16:16.384960 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 07:16:16.884945692 +0000 UTC m=+213.434603339 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zz6kk" (UID: "473085e8-ee17-4244-abd0-dcf2308b4655") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.430274 4749 ???:1] "http: TLS handshake error from 192.168.126.11:34226: no serving certificate available for the kubelet" Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.440233 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckwfm\" (UniqueName: \"kubernetes.io/projected/ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0-kube-api-access-ckwfm\") pod \"certified-operators-bbjcq\" (UID: \"ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0\") " pod="openshift-marketplace/certified-operators-bbjcq" Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.472998 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bbjcq" Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.487074 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 07:16:16 crc kubenswrapper[4749]: E0320 07:16:16.487433 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 07:16:16.987418076 +0000 UTC m=+213.537075723 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.537610 4749 ???:1] "http: TLS handshake error from 192.168.126.11:34228: no serving certificate available for the kubelet" Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.589960 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:16 crc kubenswrapper[4749]: E0320 07:16:16.590487 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 07:16:17.090475965 +0000 UTC m=+213.640133612 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zz6kk" (UID: "473085e8-ee17-4244-abd0-dcf2308b4655") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.680695 4749 ???:1] "http: TLS handshake error from 192.168.126.11:34230: no serving certificate available for the kubelet" Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.692856 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 07:16:16 crc kubenswrapper[4749]: E0320 07:16:16.693201 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 07:16:17.193185455 +0000 UTC m=+213.742843102 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.723403 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rv5h9"] Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.799515 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:16 crc kubenswrapper[4749]: E0320 07:16:16.799809 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 07:16:17.299794485 +0000 UTC m=+213.849452132 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zz6kk" (UID: "473085e8-ee17-4244-abd0-dcf2308b4655") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.823765 4749 ???:1] "http: TLS handshake error from 192.168.126.11:34244: no serving certificate available for the kubelet" Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.833472 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9ss5d"] Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.902738 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 07:16:16 crc kubenswrapper[4749]: E0320 07:16:16.903212 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 07:16:17.403198313 +0000 UTC m=+213.952855960 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.913200 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hvf29"] Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.957393 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvtqf"] Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.957970 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvtqf" podUID="4f9b9110-5f21-4d71-ac4e-61e0ff6b1899" containerName="route-controller-manager" containerID="cri-o://359237dd72e0529d1d892951a9f6ef7a321163f81dd3b69bb08b325231498b3e" gracePeriod=30 Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.960067 4749 patch_prober.go:28] interesting pod/router-default-5444994796-dbs7t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 07:16:16 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Mar 20 07:16:16 crc kubenswrapper[4749]: [+]process-running ok Mar 20 07:16:16 crc kubenswrapper[4749]: healthz check failed Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.960095 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dbs7t" podUID="da62d543-787a-4364-8271-8f8f9529dd0c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 07:16:16 crc kubenswrapper[4749]: I0320 07:16:16.994256 4749 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.004916 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:17 crc kubenswrapper[4749]: E0320 07:16:17.005216 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 07:16:17.505205685 +0000 UTC m=+214.054863332 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zz6kk" (UID: "473085e8-ee17-4244-abd0-dcf2308b4655") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.107823 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 07:16:17 crc kubenswrapper[4749]: E0320 07:16:17.108404 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 07:16:17.608389767 +0000 UTC m=+214.158047414 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.170775 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x9swj"] Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.214639 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:17 crc kubenswrapper[4749]: E0320 07:16:17.214973 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 07:16:17.714962737 +0000 UTC m=+214.264620384 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zz6kk" (UID: "473085e8-ee17-4244-abd0-dcf2308b4655") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:17 crc kubenswrapper[4749]: W0320 07:16:17.239071 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda22f47dc_59ce_4cce_821c_508fc14a9508.slice/crio-1c432bbc90f2ad3798c358ce4651b7f7d73cfbc3b8eef9d4ead6ab769c504ced WatchSource:0}: Error finding container 1c432bbc90f2ad3798c358ce4651b7f7d73cfbc3b8eef9d4ead6ab769c504ced: Status 404 returned error can't find the container with id 1c432bbc90f2ad3798c358ce4651b7f7d73cfbc3b8eef9d4ead6ab769c504ced Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.293930 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x9swj" event={"ID":"a22f47dc-59ce-4cce-821c-508fc14a9508","Type":"ContainerStarted","Data":"1c432bbc90f2ad3798c358ce4651b7f7d73cfbc3b8eef9d4ead6ab769c504ced"} Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.297798 4749 generic.go:334] "Generic (PLEG): container finished" podID="b7e5d15e-f3f5-4595-be01-ae4f196285ad" containerID="f874c7a512f959ae6157a0fad44f63e0445599ccb8af9ce663a82dd65f470823" exitCode=0 Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.297871 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rv5h9" event={"ID":"b7e5d15e-f3f5-4595-be01-ae4f196285ad","Type":"ContainerDied","Data":"f874c7a512f959ae6157a0fad44f63e0445599ccb8af9ce663a82dd65f470823"} Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.297889 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rv5h9" event={"ID":"b7e5d15e-f3f5-4595-be01-ae4f196285ad","Type":"ContainerStarted","Data":"0cdc74c769040e7772b558188d153b2ac3a0fb715d4a1f99d22c13ca1b5d0be3"} Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.300957 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-8q7f7" event={"ID":"16b05ee8-fdde-4f11-936e-0982042ccfcf","Type":"ContainerStarted","Data":"f2906c3b6cf67a61dbfb494293830608724cf937009ab62ba7a49215eb77895c"} Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.301025 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-8q7f7" event={"ID":"16b05ee8-fdde-4f11-936e-0982042ccfcf","Type":"ContainerStarted","Data":"594dbfef30acbf20fbd4fb872f684eb9f02ca73f5c57f64a54a7409f3c79152e"} Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.307533 4749 generic.go:334] "Generic (PLEG): container finished" podID="4f9b9110-5f21-4d71-ac4e-61e0ff6b1899" containerID="359237dd72e0529d1d892951a9f6ef7a321163f81dd3b69bb08b325231498b3e" exitCode=0 Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.307611 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvtqf" event={"ID":"4f9b9110-5f21-4d71-ac4e-61e0ff6b1899","Type":"ContainerDied","Data":"359237dd72e0529d1d892951a9f6ef7a321163f81dd3b69bb08b325231498b3e"} Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.323577 4749 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 07:16:17 crc kubenswrapper[4749]: E0320 07:16:17.323935 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 07:16:17.823920677 +0000 UTC m=+214.373578324 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.337617 4749 generic.go:334] "Generic (PLEG): container finished" podID="a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8" containerID="14d34f2b700101fc81e29ffcc36f01ece29b230336a477e58c063cc0be27c1ba" exitCode=0 Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.338179 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9ss5d" event={"ID":"a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8","Type":"ContainerDied","Data":"14d34f2b700101fc81e29ffcc36f01ece29b230336a477e58c063cc0be27c1ba"} Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.340581 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9ss5d" event={"ID":"a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8","Type":"ContainerStarted","Data":"e497ff3133bd24ea6fe59a5158b6eff67216c2c81bf98ac7d6188c109f4be5c2"} Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.414152 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvtqf" Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.425237 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:17 crc kubenswrapper[4749]: E0320 07:16:17.428304 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 07:16:17.928275469 +0000 UTC m=+214.477933116 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zz6kk" (UID: "473085e8-ee17-4244-abd0-dcf2308b4655") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.456489 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bbjcq"] Mar 20 07:16:17 crc kubenswrapper[4749]: W0320 07:16:17.469322 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff5144b0_e9ee_4e0a_bd55_51d8a1cd7ad0.slice/crio-3423a7f9f90c2055805b9647f76845cb1c3e89ec49178df9b4556f1322008e29 WatchSource:0}: Error finding container 3423a7f9f90c2055805b9647f76845cb1c3e89ec49178df9b4556f1322008e29: Status 404 returned error can't find the container with id 3423a7f9f90c2055805b9647f76845cb1c3e89ec49178df9b4556f1322008e29 Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.499294 4749 ???:1] "http: TLS handshake error from 192.168.126.11:34254: no serving certificate available for the kubelet" Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.525791 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f9b9110-5f21-4d71-ac4e-61e0ff6b1899-config\") pod \"4f9b9110-5f21-4d71-ac4e-61e0ff6b1899\" (UID: \"4f9b9110-5f21-4d71-ac4e-61e0ff6b1899\") " Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.525896 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-742n8\" (UniqueName: \"kubernetes.io/projected/4f9b9110-5f21-4d71-ac4e-61e0ff6b1899-kube-api-access-742n8\") pod \"4f9b9110-5f21-4d71-ac4e-61e0ff6b1899\" (UID: \"4f9b9110-5f21-4d71-ac4e-61e0ff6b1899\") " Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.525984 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4f9b9110-5f21-4d71-ac4e-61e0ff6b1899-client-ca\") pod \"4f9b9110-5f21-4d71-ac4e-61e0ff6b1899\" (UID: \"4f9b9110-5f21-4d71-ac4e-61e0ff6b1899\") " Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.526014 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f9b9110-5f21-4d71-ac4e-61e0ff6b1899-serving-cert\") pod \"4f9b9110-5f21-4d71-ac4e-61e0ff6b1899\" (UID: \"4f9b9110-5f21-4d71-ac4e-61e0ff6b1899\") " Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.526124 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 07:16:17 crc kubenswrapper[4749]: E0320 07:16:17.526412 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 07:16:18.026390721 +0000 UTC m=+214.576048368 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.527251 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f9b9110-5f21-4d71-ac4e-61e0ff6b1899-client-ca" (OuterVolumeSpecName: "client-ca") pod "4f9b9110-5f21-4d71-ac4e-61e0ff6b1899" (UID: "4f9b9110-5f21-4d71-ac4e-61e0ff6b1899"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.527337 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f9b9110-5f21-4d71-ac4e-61e0ff6b1899-config" (OuterVolumeSpecName: "config") pod "4f9b9110-5f21-4d71-ac4e-61e0ff6b1899" (UID: "4f9b9110-5f21-4d71-ac4e-61e0ff6b1899"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.533421 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f9b9110-5f21-4d71-ac4e-61e0ff6b1899-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4f9b9110-5f21-4d71-ac4e-61e0ff6b1899" (UID: "4f9b9110-5f21-4d71-ac4e-61e0ff6b1899"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.534688 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f9b9110-5f21-4d71-ac4e-61e0ff6b1899-kube-api-access-742n8" (OuterVolumeSpecName: "kube-api-access-742n8") pod "4f9b9110-5f21-4d71-ac4e-61e0ff6b1899" (UID: "4f9b9110-5f21-4d71-ac4e-61e0ff6b1899"). InnerVolumeSpecName "kube-api-access-742n8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.627095 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.627216 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-742n8\" (UniqueName: \"kubernetes.io/projected/4f9b9110-5f21-4d71-ac4e-61e0ff6b1899-kube-api-access-742n8\") on node \"crc\" DevicePath \"\"" Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.627235 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4f9b9110-5f21-4d71-ac4e-61e0ff6b1899-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.627248 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f9b9110-5f21-4d71-ac4e-61e0ff6b1899-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.627258 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f9b9110-5f21-4d71-ac4e-61e0ff6b1899-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:16:17 crc kubenswrapper[4749]: E0320 07:16:17.627585 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 07:16:18.127571903 +0000 UTC m=+214.677229550 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zz6kk" (UID: "473085e8-ee17-4244-abd0-dcf2308b4655") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.728570 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 07:16:17 crc kubenswrapper[4749]: E0320 07:16:17.728745 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 07:16:18.228719462 +0000 UTC m=+214.778377099 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.728871 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:17 crc kubenswrapper[4749]: E0320 07:16:17.729243 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 07:16:18.229233585 +0000 UTC m=+214.778891232 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zz6kk" (UID: "473085e8-ee17-4244-abd0-dcf2308b4655") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.739095 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-l72fj"] Mar 20 07:16:17 crc kubenswrapper[4749]: E0320 07:16:17.739641 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f9b9110-5f21-4d71-ac4e-61e0ff6b1899" containerName="route-controller-manager" Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.739763 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f9b9110-5f21-4d71-ac4e-61e0ff6b1899" containerName="route-controller-manager" Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.740014 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f9b9110-5f21-4d71-ac4e-61e0ff6b1899" containerName="route-controller-manager" Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.742556 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l72fj" Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.751252 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.766665 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l72fj"] Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.780363 4749 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-20T07:16:16.994291774Z","Handler":null,"Name":""} Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.819487 4749 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.819562 4749 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.829790 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.830003 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9r2c\" (UniqueName: \"kubernetes.io/projected/9c486dab-86dd-44dd-8c82-4c07ed84aa50-kube-api-access-c9r2c\") pod \"redhat-marketplace-l72fj\" (UID: \"9c486dab-86dd-44dd-8c82-4c07ed84aa50\") " pod="openshift-marketplace/redhat-marketplace-l72fj" Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.830041 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c486dab-86dd-44dd-8c82-4c07ed84aa50-utilities\") pod \"redhat-marketplace-l72fj\" (UID: \"9c486dab-86dd-44dd-8c82-4c07ed84aa50\") " pod="openshift-marketplace/redhat-marketplace-l72fj" Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.830076 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c486dab-86dd-44dd-8c82-4c07ed84aa50-catalog-content\") pod \"redhat-marketplace-l72fj\" (UID: \"9c486dab-86dd-44dd-8c82-4c07ed84aa50\") " pod="openshift-marketplace/redhat-marketplace-l72fj" Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.835053 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.931189 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.931258 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9r2c\" (UniqueName: \"kubernetes.io/projected/9c486dab-86dd-44dd-8c82-4c07ed84aa50-kube-api-access-c9r2c\") pod \"redhat-marketplace-l72fj\" (UID: \"9c486dab-86dd-44dd-8c82-4c07ed84aa50\") " pod="openshift-marketplace/redhat-marketplace-l72fj" Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.931394 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c486dab-86dd-44dd-8c82-4c07ed84aa50-utilities\") pod \"redhat-marketplace-l72fj\" (UID: \"9c486dab-86dd-44dd-8c82-4c07ed84aa50\") " pod="openshift-marketplace/redhat-marketplace-l72fj" Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.931440 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c486dab-86dd-44dd-8c82-4c07ed84aa50-catalog-content\") pod \"redhat-marketplace-l72fj\" (UID: \"9c486dab-86dd-44dd-8c82-4c07ed84aa50\") " pod="openshift-marketplace/redhat-marketplace-l72fj" Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.932302 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c486dab-86dd-44dd-8c82-4c07ed84aa50-utilities\") pod \"redhat-marketplace-l72fj\" (UID: \"9c486dab-86dd-44dd-8c82-4c07ed84aa50\") " pod="openshift-marketplace/redhat-marketplace-l72fj" Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.932381 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c486dab-86dd-44dd-8c82-4c07ed84aa50-catalog-content\") pod \"redhat-marketplace-l72fj\" (UID: \"9c486dab-86dd-44dd-8c82-4c07ed84aa50\") " pod="openshift-marketplace/redhat-marketplace-l72fj" Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.933702 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.933738 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.958202 4749 patch_prober.go:28] interesting pod/router-default-5444994796-dbs7t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 07:16:17 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Mar 20 07:16:17 crc kubenswrapper[4749]: [+]process-running ok Mar 20 07:16:17 crc kubenswrapper[4749]: healthz check failed Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.958253 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dbs7t" podUID="da62d543-787a-4364-8271-8f8f9529dd0c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.961953 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9r2c\" (UniqueName: \"kubernetes.io/projected/9c486dab-86dd-44dd-8c82-4c07ed84aa50-kube-api-access-c9r2c\") pod \"redhat-marketplace-l72fj\" (UID: \"9c486dab-86dd-44dd-8c82-4c07ed84aa50\") " pod="openshift-marketplace/redhat-marketplace-l72fj" Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.974137 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zz6kk\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:17 crc kubenswrapper[4749]: I0320 07:16:17.997816 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.091620 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l72fj" Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.146275 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-szw2w"] Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.147260 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-szw2w" Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.165390 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-szw2w"] Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.189535 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.212185 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zz6kk"] Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.235800 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd2462fa-d077-4466-8930-6f2e69938c1b-utilities\") pod \"redhat-marketplace-szw2w\" (UID: \"cd2462fa-d077-4466-8930-6f2e69938c1b\") " pod="openshift-marketplace/redhat-marketplace-szw2w" Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.238100 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd2462fa-d077-4466-8930-6f2e69938c1b-catalog-content\") pod \"redhat-marketplace-szw2w\" (UID: \"cd2462fa-d077-4466-8930-6f2e69938c1b\") " pod="openshift-marketplace/redhat-marketplace-szw2w" Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.238194 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txjch\" (UniqueName: \"kubernetes.io/projected/cd2462fa-d077-4466-8930-6f2e69938c1b-kube-api-access-txjch\") pod \"redhat-marketplace-szw2w\" (UID: \"cd2462fa-d077-4466-8930-6f2e69938c1b\") " pod="openshift-marketplace/redhat-marketplace-szw2w" Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.338806 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd2462fa-d077-4466-8930-6f2e69938c1b-utilities\") pod \"redhat-marketplace-szw2w\" (UID: \"cd2462fa-d077-4466-8930-6f2e69938c1b\") " pod="openshift-marketplace/redhat-marketplace-szw2w" Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.338865 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd2462fa-d077-4466-8930-6f2e69938c1b-catalog-content\") pod \"redhat-marketplace-szw2w\" (UID: \"cd2462fa-d077-4466-8930-6f2e69938c1b\") " pod="openshift-marketplace/redhat-marketplace-szw2w" Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.338919 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txjch\" (UniqueName: \"kubernetes.io/projected/cd2462fa-d077-4466-8930-6f2e69938c1b-kube-api-access-txjch\") pod \"redhat-marketplace-szw2w\" (UID: \"cd2462fa-d077-4466-8930-6f2e69938c1b\") " pod="openshift-marketplace/redhat-marketplace-szw2w" Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.340602 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd2462fa-d077-4466-8930-6f2e69938c1b-utilities\") pod \"redhat-marketplace-szw2w\" (UID: \"cd2462fa-d077-4466-8930-6f2e69938c1b\") " pod="openshift-marketplace/redhat-marketplace-szw2w" Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.341371 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd2462fa-d077-4466-8930-6f2e69938c1b-catalog-content\") pod \"redhat-marketplace-szw2w\" (UID: \"cd2462fa-d077-4466-8930-6f2e69938c1b\") " pod="openshift-marketplace/redhat-marketplace-szw2w" Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.349746 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvtqf" event={"ID":"4f9b9110-5f21-4d71-ac4e-61e0ff6b1899","Type":"ContainerDied","Data":"e0f33c911e139840a1235c5f3c91e2668c2965240297820c5ba107f1bec7e364"} Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.349805 4749 scope.go:117] "RemoveContainer" containerID="359237dd72e0529d1d892951a9f6ef7a321163f81dd3b69bb08b325231498b3e" Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.349953 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvtqf" Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.364322 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txjch\" (UniqueName: \"kubernetes.io/projected/cd2462fa-d077-4466-8930-6f2e69938c1b-kube-api-access-txjch\") pod \"redhat-marketplace-szw2w\" (UID: \"cd2462fa-d077-4466-8930-6f2e69938c1b\") " pod="openshift-marketplace/redhat-marketplace-szw2w" Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.365753 4749 generic.go:334] "Generic (PLEG): container finished" podID="ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0" containerID="230e7266060294fea9bd12b211aabca3ce92e4a4179d32772a8e624a999db801" exitCode=0 Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.365844 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bbjcq" event={"ID":"ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0","Type":"ContainerDied","Data":"230e7266060294fea9bd12b211aabca3ce92e4a4179d32772a8e624a999db801"} Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.365871 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bbjcq" event={"ID":"ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0","Type":"ContainerStarted","Data":"3423a7f9f90c2055805b9647f76845cb1c3e89ec49178df9b4556f1322008e29"} Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.374636 4749 generic.go:334] "Generic (PLEG): container finished" podID="a22f47dc-59ce-4cce-821c-508fc14a9508" containerID="6c42fd191ad1d44d691538c5f9f060bd726611fc36cd413387807f41709e1035" exitCode=0 Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.374695 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x9swj" event={"ID":"a22f47dc-59ce-4cce-821c-508fc14a9508","Type":"ContainerDied","Data":"6c42fd191ad1d44d691538c5f9f060bd726611fc36cd413387807f41709e1035"} Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.378163 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvtqf"] Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.381056 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvtqf"] Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.389054 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-8q7f7" 
event={"ID":"16b05ee8-fdde-4f11-936e-0982042ccfcf","Type":"ContainerStarted","Data":"a330b43c412d42b1a71a8f1bbb62e409cb2e73d1872daf17c98d5fe0e55d6ae7"} Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.395960 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-hvf29" podUID="bd1003fd-4300-423c-b500-e782a8aeb7bb" containerName="controller-manager" containerID="cri-o://fbbcf5253dbefa7a01d04ea62ce3369e448d83bba300dfaf0b5689eacc8c9959" gracePeriod=30 Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.396931 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" event={"ID":"473085e8-ee17-4244-abd0-dcf2308b4655","Type":"ContainerStarted","Data":"0fbe29db15f5ac137c6b49137c50d1e1112b8b4b098a29b89bd6f3bebb487a8c"} Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.424159 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-8q7f7" podStartSLOduration=11.424145228 podStartE2EDuration="11.424145228s" podCreationTimestamp="2026-03-20 07:16:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:16:18.422356071 +0000 UTC m=+214.972013718" watchObservedRunningTime="2026-03-20 07:16:18.424145228 +0000 UTC m=+214.973802875" Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.470824 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-szw2w" Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.536007 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l72fj"] Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.701819 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-szw2w"] Mar 20 07:16:18 crc kubenswrapper[4749]: W0320 07:16:18.718687 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd2462fa_d077_4466_8930_6f2e69938c1b.slice/crio-1bceb1f83fc2093b7e14b4c6ad527ca3b6aed59db619c4cd190ab0f3b9a635b2 WatchSource:0}: Error finding container 1bceb1f83fc2093b7e14b4c6ad527ca3b6aed59db619c4cd190ab0f3b9a635b2: Status 404 returned error can't find the container with id 1bceb1f83fc2093b7e14b4c6ad527ca3b6aed59db619c4cd190ab0f3b9a635b2 Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.745236 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-m7xc9"] Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.747139 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m7xc9" Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.755533 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.765349 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m7xc9"] Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.772759 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.774092 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.776361 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.776939 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.780044 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.808945 4749 ???:1] "http: TLS handshake error from 192.168.126.11:34262: no serving certificate available for the kubelet" Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.856570 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/aa5c3d7c-62dd-441e-a0e2-89e2eae56970-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"aa5c3d7c-62dd-441e-a0e2-89e2eae56970\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.856941 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8faad596-00ed-4982-9f42-2f1a2465098c-utilities\") pod \"redhat-operators-m7xc9\" (UID: \"8faad596-00ed-4982-9f42-2f1a2465098c\") " pod="openshift-marketplace/redhat-operators-m7xc9" Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.856974 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8faad596-00ed-4982-9f42-2f1a2465098c-catalog-content\") pod \"redhat-operators-m7xc9\" (UID: \"8faad596-00ed-4982-9f42-2f1a2465098c\") " pod="openshift-marketplace/redhat-operators-m7xc9" Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.857015 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aa5c3d7c-62dd-441e-a0e2-89e2eae56970-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"aa5c3d7c-62dd-441e-a0e2-89e2eae56970\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.857053 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4zwd\" (UniqueName: \"kubernetes.io/projected/8faad596-00ed-4982-9f42-2f1a2465098c-kube-api-access-d4zwd\") pod \"redhat-operators-m7xc9\" (UID: \"8faad596-00ed-4982-9f42-2f1a2465098c\") " pod="openshift-marketplace/redhat-operators-m7xc9" Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.961884 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/aa5c3d7c-62dd-441e-a0e2-89e2eae56970-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"aa5c3d7c-62dd-441e-a0e2-89e2eae56970\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.961953 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8faad596-00ed-4982-9f42-2f1a2465098c-utilities\") pod \"redhat-operators-m7xc9\" (UID: \"8faad596-00ed-4982-9f42-2f1a2465098c\") " 
pod="openshift-marketplace/redhat-operators-m7xc9" Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.961989 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8faad596-00ed-4982-9f42-2f1a2465098c-catalog-content\") pod \"redhat-operators-m7xc9\" (UID: \"8faad596-00ed-4982-9f42-2f1a2465098c\") " pod="openshift-marketplace/redhat-operators-m7xc9" Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.962023 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aa5c3d7c-62dd-441e-a0e2-89e2eae56970-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"aa5c3d7c-62dd-441e-a0e2-89e2eae56970\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.962110 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4zwd\" (UniqueName: \"kubernetes.io/projected/8faad596-00ed-4982-9f42-2f1a2465098c-kube-api-access-d4zwd\") pod \"redhat-operators-m7xc9\" (UID: \"8faad596-00ed-4982-9f42-2f1a2465098c\") " pod="openshift-marketplace/redhat-operators-m7xc9" Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.962391 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/aa5c3d7c-62dd-441e-a0e2-89e2eae56970-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"aa5c3d7c-62dd-441e-a0e2-89e2eae56970\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.962843 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8faad596-00ed-4982-9f42-2f1a2465098c-utilities\") pod \"redhat-operators-m7xc9\" (UID: \"8faad596-00ed-4982-9f42-2f1a2465098c\") " pod="openshift-marketplace/redhat-operators-m7xc9" Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.963006 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8faad596-00ed-4982-9f42-2f1a2465098c-catalog-content\") pod \"redhat-operators-m7xc9\" (UID: \"8faad596-00ed-4982-9f42-2f1a2465098c\") " pod="openshift-marketplace/redhat-operators-m7xc9" Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.964914 4749 patch_prober.go:28] interesting pod/router-default-5444994796-dbs7t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 07:16:18 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Mar 20 07:16:18 crc kubenswrapper[4749]: [+]process-running ok Mar 20 07:16:18 crc kubenswrapper[4749]: healthz check failed Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.965005 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dbs7t" podUID="da62d543-787a-4364-8271-8f8f9529dd0c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.966317 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-hvf29" Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.982991 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aa5c3d7c-62dd-441e-a0e2-89e2eae56970-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"aa5c3d7c-62dd-441e-a0e2-89e2eae56970\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 07:16:18 crc kubenswrapper[4749]: I0320 07:16:18.988897 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4zwd\" (UniqueName: \"kubernetes.io/projected/8faad596-00ed-4982-9f42-2f1a2465098c-kube-api-access-d4zwd\") pod \"redhat-operators-m7xc9\" (UID: \"8faad596-00ed-4982-9f42-2f1a2465098c\") " pod="openshift-marketplace/redhat-operators-m7xc9" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.016537 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c474dd6b9-fgnch"] Mar 20 07:16:19 crc kubenswrapper[4749]: E0320 07:16:19.016760 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd1003fd-4300-423c-b500-e782a8aeb7bb" containerName="controller-manager" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.016775 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd1003fd-4300-423c-b500-e782a8aeb7bb" containerName="controller-manager" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.016863 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd1003fd-4300-423c-b500-e782a8aeb7bb" containerName="controller-manager" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.017228 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c474dd6b9-fgnch" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.032503 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c474dd6b9-fgnch"] Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.032937 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.033385 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.033592 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.033881 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.034113 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.034410 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.043169 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.052043 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.054230 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.056654 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.062392 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.063152 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd1003fd-4300-423c-b500-e782a8aeb7bb-client-ca\") pod \"bd1003fd-4300-423c-b500-e782a8aeb7bb\" (UID: \"bd1003fd-4300-423c-b500-e782a8aeb7bb\") " Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.063197 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd1003fd-4300-423c-b500-e782a8aeb7bb-serving-cert\") pod \"bd1003fd-4300-423c-b500-e782a8aeb7bb\" (UID: \"bd1003fd-4300-423c-b500-e782a8aeb7bb\") " Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.063238 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bd1003fd-4300-423c-b500-e782a8aeb7bb-proxy-ca-bundles\") pod \"bd1003fd-4300-423c-b500-e782a8aeb7bb\" (UID: \"bd1003fd-4300-423c-b500-e782a8aeb7bb\") " Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.063306 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd1003fd-4300-423c-b500-e782a8aeb7bb-config\") pod \"bd1003fd-4300-423c-b500-e782a8aeb7bb\" (UID: \"bd1003fd-4300-423c-b500-e782a8aeb7bb\") " Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.063358 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrjz4\" (UniqueName: \"kubernetes.io/projected/bd1003fd-4300-423c-b500-e782a8aeb7bb-kube-api-access-wrjz4\") pod \"bd1003fd-4300-423c-b500-e782a8aeb7bb\" (UID: \"bd1003fd-4300-423c-b500-e782a8aeb7bb\") " Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.063902 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd1003fd-4300-423c-b500-e782a8aeb7bb-client-ca" (OuterVolumeSpecName: "client-ca") pod "bd1003fd-4300-423c-b500-e782a8aeb7bb" (UID: "bd1003fd-4300-423c-b500-e782a8aeb7bb"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.064077 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd1003fd-4300-423c-b500-e782a8aeb7bb-config" (OuterVolumeSpecName: "config") pod "bd1003fd-4300-423c-b500-e782a8aeb7bb" (UID: "bd1003fd-4300-423c-b500-e782a8aeb7bb"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.064618 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd1003fd-4300-423c-b500-e782a8aeb7bb-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "bd1003fd-4300-423c-b500-e782a8aeb7bb" (UID: "bd1003fd-4300-423c-b500-e782a8aeb7bb"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.075455 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd1003fd-4300-423c-b500-e782a8aeb7bb-kube-api-access-wrjz4" (OuterVolumeSpecName: "kube-api-access-wrjz4") pod "bd1003fd-4300-423c-b500-e782a8aeb7bb" (UID: "bd1003fd-4300-423c-b500-e782a8aeb7bb"). InnerVolumeSpecName "kube-api-access-wrjz4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.078530 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd1003fd-4300-423c-b500-e782a8aeb7bb-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bd1003fd-4300-423c-b500-e782a8aeb7bb" (UID: "bd1003fd-4300-423c-b500-e782a8aeb7bb"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.141523 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lq4rb"] Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.142508 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lq4rb" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.152254 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lq4rb"] Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.168892 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d83d04c8-afe2-49f9-b0ab-f63d9f79544e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d83d04c8-afe2-49f9-b0ab-f63d9f79544e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.168946 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d83d04c8-afe2-49f9-b0ab-f63d9f79544e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d83d04c8-afe2-49f9-b0ab-f63d9f79544e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.168969 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7b3e2383-a11b-4ef1-a2be-6d75f4b1babd-client-ca\") pod \"route-controller-manager-6c474dd6b9-fgnch\" (UID: \"7b3e2383-a11b-4ef1-a2be-6d75f4b1babd\") " pod="openshift-route-controller-manager/route-controller-manager-6c474dd6b9-fgnch" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.168989 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjv6b\" (UniqueName: \"kubernetes.io/projected/7b3e2383-a11b-4ef1-a2be-6d75f4b1babd-kube-api-access-rjv6b\") pod \"route-controller-manager-6c474dd6b9-fgnch\" (UID: 
\"7b3e2383-a11b-4ef1-a2be-6d75f4b1babd\") " pod="openshift-route-controller-manager/route-controller-manager-6c474dd6b9-fgnch" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.169017 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b3e2383-a11b-4ef1-a2be-6d75f4b1babd-config\") pod \"route-controller-manager-6c474dd6b9-fgnch\" (UID: \"7b3e2383-a11b-4ef1-a2be-6d75f4b1babd\") " pod="openshift-route-controller-manager/route-controller-manager-6c474dd6b9-fgnch" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.169039 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b3e2383-a11b-4ef1-a2be-6d75f4b1babd-serving-cert\") pod \"route-controller-manager-6c474dd6b9-fgnch\" (UID: \"7b3e2383-a11b-4ef1-a2be-6d75f4b1babd\") " pod="openshift-route-controller-manager/route-controller-manager-6c474dd6b9-fgnch" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.169080 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd1003fd-4300-423c-b500-e782a8aeb7bb-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.169092 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrjz4\" (UniqueName: \"kubernetes.io/projected/bd1003fd-4300-423c-b500-e782a8aeb7bb-kube-api-access-wrjz4\") on node \"crc\" DevicePath \"\"" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.169103 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd1003fd-4300-423c-b500-e782a8aeb7bb-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.169115 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd1003fd-4300-423c-b500-e782a8aeb7bb-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.169128 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bd1003fd-4300-423c-b500-e782a8aeb7bb-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.171589 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m7xc9" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.174864 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-9gg28" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.179596 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-9gg28" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.259880 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.270380 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlzhl\" (UniqueName: \"kubernetes.io/projected/937dac41-5afa-495a-9909-1152a419549c-kube-api-access-rlzhl\") pod \"redhat-operators-lq4rb\" (UID: \"937dac41-5afa-495a-9909-1152a419549c\") " pod="openshift-marketplace/redhat-operators-lq4rb" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.270690 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d83d04c8-afe2-49f9-b0ab-f63d9f79544e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d83d04c8-afe2-49f9-b0ab-f63d9f79544e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.270729 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7b3e2383-a11b-4ef1-a2be-6d75f4b1babd-client-ca\") pod \"route-controller-manager-6c474dd6b9-fgnch\" (UID: \"7b3e2383-a11b-4ef1-a2be-6d75f4b1babd\") " pod="openshift-route-controller-manager/route-controller-manager-6c474dd6b9-fgnch" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.270750 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjv6b\" (UniqueName: \"kubernetes.io/projected/7b3e2383-a11b-4ef1-a2be-6d75f4b1babd-kube-api-access-rjv6b\") pod \"route-controller-manager-6c474dd6b9-fgnch\" (UID: \"7b3e2383-a11b-4ef1-a2be-6d75f4b1babd\") " pod="openshift-route-controller-manager/route-controller-manager-6c474dd6b9-fgnch" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.270802 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b3e2383-a11b-4ef1-a2be-6d75f4b1babd-config\") pod \"route-controller-manager-6c474dd6b9-fgnch\" (UID: \"7b3e2383-a11b-4ef1-a2be-6d75f4b1babd\") " pod="openshift-route-controller-manager/route-controller-manager-6c474dd6b9-fgnch" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.270828 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b3e2383-a11b-4ef1-a2be-6d75f4b1babd-serving-cert\") pod \"route-controller-manager-6c474dd6b9-fgnch\" (UID: \"7b3e2383-a11b-4ef1-a2be-6d75f4b1babd\") " pod="openshift-route-controller-manager/route-controller-manager-6c474dd6b9-fgnch" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.270876 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/937dac41-5afa-495a-9909-1152a419549c-utilities\") pod \"redhat-operators-lq4rb\" (UID: \"937dac41-5afa-495a-9909-1152a419549c\") " pod="openshift-marketplace/redhat-operators-lq4rb" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.270949 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d83d04c8-afe2-49f9-b0ab-f63d9f79544e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d83d04c8-afe2-49f9-b0ab-f63d9f79544e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.270975 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/937dac41-5afa-495a-9909-1152a419549c-catalog-content\") pod \"redhat-operators-lq4rb\" (UID: \"937dac41-5afa-495a-9909-1152a419549c\") " pod="openshift-marketplace/redhat-operators-lq4rb" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.272017 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d83d04c8-afe2-49f9-b0ab-f63d9f79544e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d83d04c8-afe2-49f9-b0ab-f63d9f79544e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.277324 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b3e2383-a11b-4ef1-a2be-6d75f4b1babd-config\") pod \"route-controller-manager-6c474dd6b9-fgnch\" (UID: \"7b3e2383-a11b-4ef1-a2be-6d75f4b1babd\") " pod="openshift-route-controller-manager/route-controller-manager-6c474dd6b9-fgnch" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.282616 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7b3e2383-a11b-4ef1-a2be-6d75f4b1babd-client-ca\") pod \"route-controller-manager-6c474dd6b9-fgnch\" (UID: \"7b3e2383-a11b-4ef1-a2be-6d75f4b1babd\") " pod="openshift-route-controller-manager/route-controller-manager-6c474dd6b9-fgnch" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.296336 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d83d04c8-afe2-49f9-b0ab-f63d9f79544e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d83d04c8-afe2-49f9-b0ab-f63d9f79544e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.296833 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b3e2383-a11b-4ef1-a2be-6d75f4b1babd-serving-cert\") pod \"route-controller-manager-6c474dd6b9-fgnch\" (UID: \"7b3e2383-a11b-4ef1-a2be-6d75f4b1babd\") " pod="openshift-route-controller-manager/route-controller-manager-6c474dd6b9-fgnch" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.310133 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjv6b\" (UniqueName: \"kubernetes.io/projected/7b3e2383-a11b-4ef1-a2be-6d75f4b1babd-kube-api-access-rjv6b\") pod \"route-controller-manager-6c474dd6b9-fgnch\" (UID: \"7b3e2383-a11b-4ef1-a2be-6d75f4b1babd\") " pod="openshift-route-controller-manager/route-controller-manager-6c474dd6b9-fgnch" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.354799 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c474dd6b9-fgnch" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.367150 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.375528 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/937dac41-5afa-495a-9909-1152a419549c-catalog-content\") pod \"redhat-operators-lq4rb\" (UID: \"937dac41-5afa-495a-9909-1152a419549c\") " pod="openshift-marketplace/redhat-operators-lq4rb" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.375583 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlzhl\" (UniqueName: \"kubernetes.io/projected/937dac41-5afa-495a-9909-1152a419549c-kube-api-access-rlzhl\") pod \"redhat-operators-lq4rb\" (UID: \"937dac41-5afa-495a-9909-1152a419549c\") " pod="openshift-marketplace/redhat-operators-lq4rb" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.375639 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/937dac41-5afa-495a-9909-1152a419549c-utilities\") pod \"redhat-operators-lq4rb\" (UID: \"937dac41-5afa-495a-9909-1152a419549c\") " pod="openshift-marketplace/redhat-operators-lq4rb" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.376136 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/937dac41-5afa-495a-9909-1152a419549c-utilities\") pod \"redhat-operators-lq4rb\" (UID: \"937dac41-5afa-495a-9909-1152a419549c\") " pod="openshift-marketplace/redhat-operators-lq4rb" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.378552 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/937dac41-5afa-495a-9909-1152a419549c-catalog-content\") pod \"redhat-operators-lq4rb\" (UID: \"937dac41-5afa-495a-9909-1152a419549c\") " pod="openshift-marketplace/redhat-operators-lq4rb" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.397523 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlzhl\" (UniqueName: \"kubernetes.io/projected/937dac41-5afa-495a-9909-1152a419549c-kube-api-access-rlzhl\") pod \"redhat-operators-lq4rb\" (UID: \"937dac41-5afa-495a-9909-1152a419549c\") " pod="openshift-marketplace/redhat-operators-lq4rb" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.441196 4749 generic.go:334] "Generic (PLEG): container finished" podID="9c486dab-86dd-44dd-8c82-4c07ed84aa50" containerID="e277e7269daa7a860daf228a1a39f92b4b140989bc9efc5568d849d5d3baa18b" exitCode=0 Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.441613 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l72fj" event={"ID":"9c486dab-86dd-44dd-8c82-4c07ed84aa50","Type":"ContainerDied","Data":"e277e7269daa7a860daf228a1a39f92b4b140989bc9efc5568d849d5d3baa18b"} Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.441670 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l72fj" event={"ID":"9c486dab-86dd-44dd-8c82-4c07ed84aa50","Type":"ContainerStarted","Data":"8a082d7c577197e0e3943ef9dd3a66d7f42bd030c0e2c94bb748625bcd7a5460"} Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.449185 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" 
event={"ID":"473085e8-ee17-4244-abd0-dcf2308b4655","Type":"ContainerStarted","Data":"b80e21067c1aaa7790ff4209fbe44d9983502a44bcb10a8772fd02b4457779a8"} Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.449305 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.458965 4749 generic.go:334] "Generic (PLEG): container finished" podID="bd1003fd-4300-423c-b500-e782a8aeb7bb" containerID="fbbcf5253dbefa7a01d04ea62ce3369e448d83bba300dfaf0b5689eacc8c9959" exitCode=0 Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.459021 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-hvf29" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.459045 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-hvf29" event={"ID":"bd1003fd-4300-423c-b500-e782a8aeb7bb","Type":"ContainerDied","Data":"fbbcf5253dbefa7a01d04ea62ce3369e448d83bba300dfaf0b5689eacc8c9959"} Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.459070 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-hvf29" event={"ID":"bd1003fd-4300-423c-b500-e782a8aeb7bb","Type":"ContainerDied","Data":"55cdec982b6e3f07b62ae4151f558f756ec1a953fdaf158d72963526ee19cf5d"} Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.459087 4749 scope.go:117] "RemoveContainer" containerID="fbbcf5253dbefa7a01d04ea62ce3369e448d83bba300dfaf0b5689eacc8c9959" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.478087 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lq4rb" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.496003 4749 generic.go:334] "Generic (PLEG): container finished" podID="cd2462fa-d077-4466-8930-6f2e69938c1b" containerID="a1a682f4a82cd43727b79d814e9dfa97798f75d71d9d6a9c3f23f09fdceb5da8" exitCode=0 Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.496329 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-szw2w" event={"ID":"cd2462fa-d077-4466-8930-6f2e69938c1b","Type":"ContainerDied","Data":"a1a682f4a82cd43727b79d814e9dfa97798f75d71d9d6a9c3f23f09fdceb5da8"} Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.496380 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-szw2w" event={"ID":"cd2462fa-d077-4466-8930-6f2e69938c1b","Type":"ContainerStarted","Data":"1bceb1f83fc2093b7e14b4c6ad527ca3b6aed59db619c4cd190ab0f3b9a635b2"} Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.520000 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" podStartSLOduration=158.519983534 podStartE2EDuration="2m38.519983534s" podCreationTimestamp="2026-03-20 07:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:16:19.491590715 +0000 UTC m=+216.041248362" watchObservedRunningTime="2026-03-20 07:16:19.519983534 +0000 UTC m=+216.069641171" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.536535 4749 patch_prober.go:28] interesting pod/downloads-7954f5f757-952x2 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.536575 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-952x2" podUID="95e4555b-7f8b-4297-bed6-e0cf5e90ea3e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.537688 4749 patch_prober.go:28] interesting pod/downloads-7954f5f757-952x2 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.537708 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-952x2" podUID="95e4555b-7f8b-4297-bed6-e0cf5e90ea3e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.546507 4749 scope.go:117] "RemoveContainer" containerID="fbbcf5253dbefa7a01d04ea62ce3369e448d83bba300dfaf0b5689eacc8c9959" Mar 20 07:16:19 crc kubenswrapper[4749]: E0320 07:16:19.549995 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbbcf5253dbefa7a01d04ea62ce3369e448d83bba300dfaf0b5689eacc8c9959\": container with ID starting with fbbcf5253dbefa7a01d04ea62ce3369e448d83bba300dfaf0b5689eacc8c9959 not found: ID does not exist" 
containerID="fbbcf5253dbefa7a01d04ea62ce3369e448d83bba300dfaf0b5689eacc8c9959" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.550033 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbbcf5253dbefa7a01d04ea62ce3369e448d83bba300dfaf0b5689eacc8c9959"} err="failed to get container status \"fbbcf5253dbefa7a01d04ea62ce3369e448d83bba300dfaf0b5689eacc8c9959\": rpc error: code = NotFound desc = could not find container \"fbbcf5253dbefa7a01d04ea62ce3369e448d83bba300dfaf0b5689eacc8c9959\": container with ID starting with fbbcf5253dbefa7a01d04ea62ce3369e448d83bba300dfaf0b5689eacc8c9959 not found: ID does not exist" Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.588890 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hvf29"] Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.604444 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hvf29"] Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.628079 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m7xc9"] Mar 20 07:16:19 crc kubenswrapper[4749]: W0320 07:16:19.686997 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8faad596_00ed_4982_9f42_2f1a2465098c.slice/crio-4e75a4e346c83f8b735eff87be26225ab4e08f99ad40440332375250e08fc9a6 WatchSource:0}: Error finding container 4e75a4e346c83f8b735eff87be26225ab4e08f99ad40440332375250e08fc9a6: Status 404 returned error can't find the container with id 4e75a4e346c83f8b735eff87be26225ab4e08f99ad40440332375250e08fc9a6 Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.907034 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.921319 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.956368 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-dbs7t" Mar 20 07:16:19 crc kubenswrapper[4749]: W0320 07:16:19.956497 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd83d04c8_afe2_49f9_b0ab_f63d9f79544e.slice/crio-30373b71548fdac90de777a026a4e21d75377e3d0560d3664cc3bef1f2e5b0ac WatchSource:0}: Error finding container 30373b71548fdac90de777a026a4e21d75377e3d0560d3664cc3bef1f2e5b0ac: Status 404 returned error can't find the container with id 30373b71548fdac90de777a026a4e21d75377e3d0560d3664cc3bef1f2e5b0ac Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.959770 4749 patch_prober.go:28] interesting pod/router-default-5444994796-dbs7t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 07:16:19 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Mar 20 07:16:19 crc kubenswrapper[4749]: [+]process-running ok Mar 20 07:16:19 crc kubenswrapper[4749]: healthz check failed Mar 20 07:16:19 crc kubenswrapper[4749]: I0320 07:16:19.959810 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dbs7t" podUID="da62d543-787a-4364-8271-8f8f9529dd0c" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.000681 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lq4rb"] Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.008413 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c474dd6b9-fgnch"] Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.016564 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6bfbf87665-k7ffm"] Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.017242 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6bfbf87665-k7ffm" Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.029896 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.029991 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.030001 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.030295 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.030361 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.031664 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.034893 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6bfbf87665-k7ffm"] Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.045315 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 07:16:20 crc kubenswrapper[4749]: W0320 07:16:20.045670 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b3e2383_a11b_4ef1_a2be_6d75f4b1babd.slice/crio-cabb6f16f7765a735f96274e3398b4edf156a1cc7a8a2dd52cbdb37ab4c29631 WatchSource:0}: Error finding container cabb6f16f7765a735f96274e3398b4edf156a1cc7a8a2dd52cbdb37ab4c29631: Status 404 returned error can't find the container with id cabb6f16f7765a735f96274e3398b4edf156a1cc7a8a2dd52cbdb37ab4c29631 Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.090709 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/97ac526c-98ac-4cb7-8e84-00b4f9808a06-client-ca\") pod \"controller-manager-6bfbf87665-k7ffm\" (UID: \"97ac526c-98ac-4cb7-8e84-00b4f9808a06\") " pod="openshift-controller-manager/controller-manager-6bfbf87665-k7ffm" Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.090942 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kmwx\" (UniqueName: 
\"kubernetes.io/projected/97ac526c-98ac-4cb7-8e84-00b4f9808a06-kube-api-access-4kmwx\") pod \"controller-manager-6bfbf87665-k7ffm\" (UID: \"97ac526c-98ac-4cb7-8e84-00b4f9808a06\") " pod="openshift-controller-manager/controller-manager-6bfbf87665-k7ffm" Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.090982 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97ac526c-98ac-4cb7-8e84-00b4f9808a06-config\") pod \"controller-manager-6bfbf87665-k7ffm\" (UID: \"97ac526c-98ac-4cb7-8e84-00b4f9808a06\") " pod="openshift-controller-manager/controller-manager-6bfbf87665-k7ffm" Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.091165 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97ac526c-98ac-4cb7-8e84-00b4f9808a06-serving-cert\") pod \"controller-manager-6bfbf87665-k7ffm\" (UID: \"97ac526c-98ac-4cb7-8e84-00b4f9808a06\") " pod="openshift-controller-manager/controller-manager-6bfbf87665-k7ffm" Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.091201 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/97ac526c-98ac-4cb7-8e84-00b4f9808a06-proxy-ca-bundles\") pod \"controller-manager-6bfbf87665-k7ffm\" (UID: \"97ac526c-98ac-4cb7-8e84-00b4f9808a06\") " pod="openshift-controller-manager/controller-manager-6bfbf87665-k7ffm" Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.191066 4749 patch_prober.go:28] interesting pod/console-f9d7485db-2zlqs container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.31:8443/health\": dial tcp 10.217.0.31:8443: connect: connection refused" start-of-body= Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.191344 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-2zlqs" podUID="ff4ae6b4-eebc-4a32-b390-ec7ea70c8841" containerName="console" probeResult="failure" output="Get \"https://10.217.0.31:8443/health\": dial tcp 10.217.0.31:8443: connect: connection refused" Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.191922 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/97ac526c-98ac-4cb7-8e84-00b4f9808a06-client-ca\") pod \"controller-manager-6bfbf87665-k7ffm\" (UID: \"97ac526c-98ac-4cb7-8e84-00b4f9808a06\") " pod="openshift-controller-manager/controller-manager-6bfbf87665-k7ffm" Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.191970 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kmwx\" (UniqueName: \"kubernetes.io/projected/97ac526c-98ac-4cb7-8e84-00b4f9808a06-kube-api-access-4kmwx\") pod \"controller-manager-6bfbf87665-k7ffm\" (UID: \"97ac526c-98ac-4cb7-8e84-00b4f9808a06\") " pod="openshift-controller-manager/controller-manager-6bfbf87665-k7ffm" Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.191989 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97ac526c-98ac-4cb7-8e84-00b4f9808a06-config\") pod \"controller-manager-6bfbf87665-k7ffm\" (UID: \"97ac526c-98ac-4cb7-8e84-00b4f9808a06\") " pod="openshift-controller-manager/controller-manager-6bfbf87665-k7ffm" Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 
07:16:20.192044 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97ac526c-98ac-4cb7-8e84-00b4f9808a06-serving-cert\") pod \"controller-manager-6bfbf87665-k7ffm\" (UID: \"97ac526c-98ac-4cb7-8e84-00b4f9808a06\") " pod="openshift-controller-manager/controller-manager-6bfbf87665-k7ffm" Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.192068 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/97ac526c-98ac-4cb7-8e84-00b4f9808a06-proxy-ca-bundles\") pod \"controller-manager-6bfbf87665-k7ffm\" (UID: \"97ac526c-98ac-4cb7-8e84-00b4f9808a06\") " pod="openshift-controller-manager/controller-manager-6bfbf87665-k7ffm" Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.194431 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/97ac526c-98ac-4cb7-8e84-00b4f9808a06-proxy-ca-bundles\") pod \"controller-manager-6bfbf87665-k7ffm\" (UID: \"97ac526c-98ac-4cb7-8e84-00b4f9808a06\") " pod="openshift-controller-manager/controller-manager-6bfbf87665-k7ffm" Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.195522 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97ac526c-98ac-4cb7-8e84-00b4f9808a06-config\") pod \"controller-manager-6bfbf87665-k7ffm\" (UID: \"97ac526c-98ac-4cb7-8e84-00b4f9808a06\") " pod="openshift-controller-manager/controller-manager-6bfbf87665-k7ffm" Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.195661 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/97ac526c-98ac-4cb7-8e84-00b4f9808a06-client-ca\") pod \"controller-manager-6bfbf87665-k7ffm\" (UID: \"97ac526c-98ac-4cb7-8e84-00b4f9808a06\") " pod="openshift-controller-manager/controller-manager-6bfbf87665-k7ffm" Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.195874 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f9b9110-5f21-4d71-ac4e-61e0ff6b1899" path="/var/lib/kubelet/pods/4f9b9110-5f21-4d71-ac4e-61e0ff6b1899/volumes" Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.196525 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd1003fd-4300-423c-b500-e782a8aeb7bb" path="/var/lib/kubelet/pods/bd1003fd-4300-423c-b500-e782a8aeb7bb/volumes" Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.196927 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-2zlqs" Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.196945 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-2zlqs" Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.201442 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97ac526c-98ac-4cb7-8e84-00b4f9808a06-serving-cert\") pod \"controller-manager-6bfbf87665-k7ffm\" (UID: \"97ac526c-98ac-4cb7-8e84-00b4f9808a06\") " pod="openshift-controller-manager/controller-manager-6bfbf87665-k7ffm" Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.207838 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kmwx\" (UniqueName: \"kubernetes.io/projected/97ac526c-98ac-4cb7-8e84-00b4f9808a06-kube-api-access-4kmwx\") pod 
\"controller-manager-6bfbf87665-k7ffm\" (UID: \"97ac526c-98ac-4cb7-8e84-00b4f9808a06\") " pod="openshift-controller-manager/controller-manager-6bfbf87665-k7ffm" Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.358620 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6bfbf87665-k7ffm" Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.506134 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c474dd6b9-fgnch" event={"ID":"7b3e2383-a11b-4ef1-a2be-6d75f4b1babd","Type":"ContainerStarted","Data":"9750f87260ff033336215122e6ed2ab1737c8f199935f7ec5c1da9c327e058cc"} Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.506186 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c474dd6b9-fgnch" event={"ID":"7b3e2383-a11b-4ef1-a2be-6d75f4b1babd","Type":"ContainerStarted","Data":"cabb6f16f7765a735f96274e3398b4edf156a1cc7a8a2dd52cbdb37ab4c29631"} Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.507697 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6c474dd6b9-fgnch" Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.546988 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"aa5c3d7c-62dd-441e-a0e2-89e2eae56970","Type":"ContainerStarted","Data":"97b3a075ec8aea9a79b5fc81fdaba07adc443e6d518806cb0ac7e161e335823f"} Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.547128 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"aa5c3d7c-62dd-441e-a0e2-89e2eae56970","Type":"ContainerStarted","Data":"f4da6bd55a4430e1568f6e18cfc75f91774dec649918a5e3c2f026022082f9e2"} Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.548244 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6c474dd6b9-fgnch" podStartSLOduration=3.5482184439999997 podStartE2EDuration="3.548218444s" podCreationTimestamp="2026-03-20 07:16:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:16:20.538271198 +0000 UTC m=+217.087928845" watchObservedRunningTime="2026-03-20 07:16:20.548218444 +0000 UTC m=+217.097876101" Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.560389 4749 generic.go:334] "Generic (PLEG): container finished" podID="937dac41-5afa-495a-9909-1152a419549c" containerID="923ff68fbc0eeabcf3046d61cc602be29815801953a9f68cb5bd2df14a94ab7f" exitCode=0 Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.560486 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lq4rb" event={"ID":"937dac41-5afa-495a-9909-1152a419549c","Type":"ContainerDied","Data":"923ff68fbc0eeabcf3046d61cc602be29815801953a9f68cb5bd2df14a94ab7f"} Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.560531 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lq4rb" event={"ID":"937dac41-5afa-495a-9909-1152a419549c","Type":"ContainerStarted","Data":"808123ecdcdbc205854608f4e17e2661ef6666efba995198c0ef116a48cfaf1f"} Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.564559 4749 generic.go:334] "Generic (PLEG): 
container finished" podID="8faad596-00ed-4982-9f42-2f1a2465098c" containerID="a187deff2bddc293f5f4627656514e6ed7fb308965e742a6519be5199f2210ee" exitCode=0 Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.564668 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m7xc9" event={"ID":"8faad596-00ed-4982-9f42-2f1a2465098c","Type":"ContainerDied","Data":"a187deff2bddc293f5f4627656514e6ed7fb308965e742a6519be5199f2210ee"} Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.564722 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m7xc9" event={"ID":"8faad596-00ed-4982-9f42-2f1a2465098c","Type":"ContainerStarted","Data":"4e75a4e346c83f8b735eff87be26225ab4e08f99ad40440332375250e08fc9a6"} Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.574889 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.5748736389999998 podStartE2EDuration="2.574873639s" podCreationTimestamp="2026-03-20 07:16:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:16:20.57372366 +0000 UTC m=+217.123381307" watchObservedRunningTime="2026-03-20 07:16:20.574873639 +0000 UTC m=+217.124531286" Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.583382 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d83d04c8-afe2-49f9-b0ab-f63d9f79544e","Type":"ContainerStarted","Data":"f0a8fde2639f6d5e6338f535b1f24aa6560f3a33d175d720decda9b6a924c5ac"} Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.583422 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d83d04c8-afe2-49f9-b0ab-f63d9f79544e","Type":"ContainerStarted","Data":"30373b71548fdac90de777a026a4e21d75377e3d0560d3664cc3bef1f2e5b0ac"} Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.614537 4749 generic.go:334] "Generic (PLEG): container finished" podID="2af8695f-a945-411d-ac95-03191fb3080d" containerID="04c94989401b3e171aa790f44e5e33a1bb2df54d0c87db557e5ced47d88d57d8" exitCode=0 Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.614702 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566515-6hw5x" event={"ID":"2af8695f-a945-411d-ac95-03191fb3080d","Type":"ContainerDied","Data":"04c94989401b3e171aa790f44e5e33a1bb2df54d0c87db557e5ced47d88d57d8"} Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.627580 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=1.627561533 podStartE2EDuration="1.627561533s" podCreationTimestamp="2026-03-20 07:16:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:16:20.623583951 +0000 UTC m=+217.173241608" watchObservedRunningTime="2026-03-20 07:16:20.627561533 +0000 UTC m=+217.177219180" Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.762470 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6c474dd6b9-fgnch" Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.923648 4749 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-controller-manager/controller-manager-6bfbf87665-k7ffm"] Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.959146 4749 patch_prober.go:28] interesting pod/router-default-5444994796-dbs7t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 07:16:20 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Mar 20 07:16:20 crc kubenswrapper[4749]: [+]process-running ok Mar 20 07:16:20 crc kubenswrapper[4749]: healthz check failed Mar 20 07:16:20 crc kubenswrapper[4749]: I0320 07:16:20.959220 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dbs7t" podUID="da62d543-787a-4364-8271-8f8f9529dd0c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 07:16:20 crc kubenswrapper[4749]: W0320 07:16:20.978406 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97ac526c_98ac_4cb7_8e84_00b4f9808a06.slice/crio-651001f401166c72df2c1e526f7425029fc3bd945ef3fdd395563df958218e8d WatchSource:0}: Error finding container 651001f401166c72df2c1e526f7425029fc3bd945ef3fdd395563df958218e8d: Status 404 returned error can't find the container with id 651001f401166c72df2c1e526f7425029fc3bd945ef3fdd395563df958218e8d Mar 20 07:16:21 crc kubenswrapper[4749]: I0320 07:16:21.399457 4749 ???:1] "http: TLS handshake error from 192.168.126.11:44206: no serving certificate available for the kubelet" Mar 20 07:16:21 crc kubenswrapper[4749]: I0320 07:16:21.667906 4749 generic.go:334] "Generic (PLEG): container finished" podID="d83d04c8-afe2-49f9-b0ab-f63d9f79544e" containerID="f0a8fde2639f6d5e6338f535b1f24aa6560f3a33d175d720decda9b6a924c5ac" exitCode=0 Mar 20 07:16:21 crc kubenswrapper[4749]: I0320 07:16:21.668006 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d83d04c8-afe2-49f9-b0ab-f63d9f79544e","Type":"ContainerDied","Data":"f0a8fde2639f6d5e6338f535b1f24aa6560f3a33d175d720decda9b6a924c5ac"} Mar 20 07:16:21 crc kubenswrapper[4749]: I0320 07:16:21.676466 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6bfbf87665-k7ffm" event={"ID":"97ac526c-98ac-4cb7-8e84-00b4f9808a06","Type":"ContainerStarted","Data":"32ae71a88ffe40e367c62e61dd4e8374d45e24169ad616ac80d6e8e69bbaa43b"} Mar 20 07:16:21 crc kubenswrapper[4749]: I0320 07:16:21.676510 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6bfbf87665-k7ffm" event={"ID":"97ac526c-98ac-4cb7-8e84-00b4f9808a06","Type":"ContainerStarted","Data":"651001f401166c72df2c1e526f7425029fc3bd945ef3fdd395563df958218e8d"} Mar 20 07:16:21 crc kubenswrapper[4749]: I0320 07:16:21.677099 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6bfbf87665-k7ffm" Mar 20 07:16:21 crc kubenswrapper[4749]: I0320 07:16:21.689222 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6bfbf87665-k7ffm" Mar 20 07:16:21 crc kubenswrapper[4749]: I0320 07:16:21.692368 4749 generic.go:334] "Generic (PLEG): container finished" podID="aa5c3d7c-62dd-441e-a0e2-89e2eae56970" 
containerID="97b3a075ec8aea9a79b5fc81fdaba07adc443e6d518806cb0ac7e161e335823f" exitCode=0 Mar 20 07:16:21 crc kubenswrapper[4749]: I0320 07:16:21.692449 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"aa5c3d7c-62dd-441e-a0e2-89e2eae56970","Type":"ContainerDied","Data":"97b3a075ec8aea9a79b5fc81fdaba07adc443e6d518806cb0ac7e161e335823f"} Mar 20 07:16:21 crc kubenswrapper[4749]: I0320 07:16:21.706563 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6bfbf87665-k7ffm" podStartSLOduration=3.706548718 podStartE2EDuration="3.706548718s" podCreationTimestamp="2026-03-20 07:16:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:16:21.70468413 +0000 UTC m=+218.254341777" watchObservedRunningTime="2026-03-20 07:16:21.706548718 +0000 UTC m=+218.256206355" Mar 20 07:16:21 crc kubenswrapper[4749]: I0320 07:16:21.964210 4749 patch_prober.go:28] interesting pod/router-default-5444994796-dbs7t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 07:16:21 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Mar 20 07:16:21 crc kubenswrapper[4749]: [+]process-running ok Mar 20 07:16:21 crc kubenswrapper[4749]: healthz check failed Mar 20 07:16:21 crc kubenswrapper[4749]: I0320 07:16:21.964638 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dbs7t" podUID="da62d543-787a-4364-8271-8f8f9529dd0c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 07:16:21 crc kubenswrapper[4749]: I0320 07:16:21.998714 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566515-6hw5x" Mar 20 07:16:22 crc kubenswrapper[4749]: I0320 07:16:22.035170 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2af8695f-a945-411d-ac95-03191fb3080d-config-volume\") pod \"2af8695f-a945-411d-ac95-03191fb3080d\" (UID: \"2af8695f-a945-411d-ac95-03191fb3080d\") " Mar 20 07:16:22 crc kubenswrapper[4749]: I0320 07:16:22.035345 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vq27l\" (UniqueName: \"kubernetes.io/projected/2af8695f-a945-411d-ac95-03191fb3080d-kube-api-access-vq27l\") pod \"2af8695f-a945-411d-ac95-03191fb3080d\" (UID: \"2af8695f-a945-411d-ac95-03191fb3080d\") " Mar 20 07:16:22 crc kubenswrapper[4749]: I0320 07:16:22.035402 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2af8695f-a945-411d-ac95-03191fb3080d-secret-volume\") pod \"2af8695f-a945-411d-ac95-03191fb3080d\" (UID: \"2af8695f-a945-411d-ac95-03191fb3080d\") " Mar 20 07:16:22 crc kubenswrapper[4749]: I0320 07:16:22.043444 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2af8695f-a945-411d-ac95-03191fb3080d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2af8695f-a945-411d-ac95-03191fb3080d" (UID: "2af8695f-a945-411d-ac95-03191fb3080d"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:16:22 crc kubenswrapper[4749]: I0320 07:16:22.044166 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2af8695f-a945-411d-ac95-03191fb3080d-config-volume" (OuterVolumeSpecName: "config-volume") pod "2af8695f-a945-411d-ac95-03191fb3080d" (UID: "2af8695f-a945-411d-ac95-03191fb3080d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:16:22 crc kubenswrapper[4749]: I0320 07:16:22.065060 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2af8695f-a945-411d-ac95-03191fb3080d-kube-api-access-vq27l" (OuterVolumeSpecName: "kube-api-access-vq27l") pod "2af8695f-a945-411d-ac95-03191fb3080d" (UID: "2af8695f-a945-411d-ac95-03191fb3080d"). InnerVolumeSpecName "kube-api-access-vq27l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:16:22 crc kubenswrapper[4749]: I0320 07:16:22.137174 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vq27l\" (UniqueName: \"kubernetes.io/projected/2af8695f-a945-411d-ac95-03191fb3080d-kube-api-access-vq27l\") on node \"crc\" DevicePath \"\"" Mar 20 07:16:22 crc kubenswrapper[4749]: I0320 07:16:22.137221 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2af8695f-a945-411d-ac95-03191fb3080d-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 07:16:22 crc kubenswrapper[4749]: I0320 07:16:22.137234 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2af8695f-a945-411d-ac95-03191fb3080d-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 07:16:22 crc kubenswrapper[4749]: I0320 07:16:22.711771 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566515-6hw5x" Mar 20 07:16:22 crc kubenswrapper[4749]: I0320 07:16:22.712385 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566515-6hw5x" event={"ID":"2af8695f-a945-411d-ac95-03191fb3080d","Type":"ContainerDied","Data":"bb4595cd35635888e5c9123be97d5d56c7b048c9b161f766c504cd64942184ed"} Mar 20 07:16:22 crc kubenswrapper[4749]: I0320 07:16:22.712413 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb4595cd35635888e5c9123be97d5d56c7b048c9b161f766c504cd64942184ed" Mar 20 07:16:22 crc kubenswrapper[4749]: I0320 07:16:22.956544 4749 patch_prober.go:28] interesting pod/router-default-5444994796-dbs7t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 07:16:22 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Mar 20 07:16:22 crc kubenswrapper[4749]: [+]process-running ok Mar 20 07:16:22 crc kubenswrapper[4749]: healthz check failed Mar 20 07:16:22 crc kubenswrapper[4749]: I0320 07:16:22.956792 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dbs7t" podUID="da62d543-787a-4364-8271-8f8f9529dd0c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 07:16:23 crc kubenswrapper[4749]: I0320 07:16:23.008606 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 07:16:23 crc kubenswrapper[4749]: I0320 07:16:23.054465 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aa5c3d7c-62dd-441e-a0e2-89e2eae56970-kube-api-access\") pod \"aa5c3d7c-62dd-441e-a0e2-89e2eae56970\" (UID: \"aa5c3d7c-62dd-441e-a0e2-89e2eae56970\") " Mar 20 07:16:23 crc kubenswrapper[4749]: I0320 07:16:23.054509 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/aa5c3d7c-62dd-441e-a0e2-89e2eae56970-kubelet-dir\") pod \"aa5c3d7c-62dd-441e-a0e2-89e2eae56970\" (UID: \"aa5c3d7c-62dd-441e-a0e2-89e2eae56970\") " Mar 20 07:16:23 crc kubenswrapper[4749]: I0320 07:16:23.054812 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aa5c3d7c-62dd-441e-a0e2-89e2eae56970-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "aa5c3d7c-62dd-441e-a0e2-89e2eae56970" (UID: "aa5c3d7c-62dd-441e-a0e2-89e2eae56970"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 07:16:23 crc kubenswrapper[4749]: I0320 07:16:23.058565 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa5c3d7c-62dd-441e-a0e2-89e2eae56970-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "aa5c3d7c-62dd-441e-a0e2-89e2eae56970" (UID: "aa5c3d7c-62dd-441e-a0e2-89e2eae56970"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:16:23 crc kubenswrapper[4749]: I0320 07:16:23.150032 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 07:16:23 crc kubenswrapper[4749]: I0320 07:16:23.156140 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aa5c3d7c-62dd-441e-a0e2-89e2eae56970-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 07:16:23 crc kubenswrapper[4749]: I0320 07:16:23.156206 4749 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/aa5c3d7c-62dd-441e-a0e2-89e2eae56970-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 07:16:23 crc kubenswrapper[4749]: I0320 07:16:23.257076 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d83d04c8-afe2-49f9-b0ab-f63d9f79544e-kubelet-dir\") pod \"d83d04c8-afe2-49f9-b0ab-f63d9f79544e\" (UID: \"d83d04c8-afe2-49f9-b0ab-f63d9f79544e\") " Mar 20 07:16:23 crc kubenswrapper[4749]: I0320 07:16:23.257186 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d83d04c8-afe2-49f9-b0ab-f63d9f79544e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d83d04c8-afe2-49f9-b0ab-f63d9f79544e" (UID: "d83d04c8-afe2-49f9-b0ab-f63d9f79544e"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 07:16:23 crc kubenswrapper[4749]: I0320 07:16:23.258006 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d83d04c8-afe2-49f9-b0ab-f63d9f79544e-kube-api-access\") pod \"d83d04c8-afe2-49f9-b0ab-f63d9f79544e\" (UID: \"d83d04c8-afe2-49f9-b0ab-f63d9f79544e\") " Mar 20 07:16:23 crc kubenswrapper[4749]: I0320 07:16:23.258419 4749 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d83d04c8-afe2-49f9-b0ab-f63d9f79544e-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 07:16:23 crc kubenswrapper[4749]: I0320 07:16:23.261197 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d83d04c8-afe2-49f9-b0ab-f63d9f79544e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d83d04c8-afe2-49f9-b0ab-f63d9f79544e" (UID: "d83d04c8-afe2-49f9-b0ab-f63d9f79544e"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:16:23 crc kubenswrapper[4749]: I0320 07:16:23.360487 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d83d04c8-afe2-49f9-b0ab-f63d9f79544e-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 07:16:23 crc kubenswrapper[4749]: I0320 07:16:23.791651 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"aa5c3d7c-62dd-441e-a0e2-89e2eae56970","Type":"ContainerDied","Data":"f4da6bd55a4430e1568f6e18cfc75f91774dec649918a5e3c2f026022082f9e2"} Mar 20 07:16:23 crc kubenswrapper[4749]: I0320 07:16:23.791706 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4da6bd55a4430e1568f6e18cfc75f91774dec649918a5e3c2f026022082f9e2" Mar 20 07:16:23 crc kubenswrapper[4749]: I0320 07:16:23.792019 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 07:16:23 crc kubenswrapper[4749]: I0320 07:16:23.798762 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d83d04c8-afe2-49f9-b0ab-f63d9f79544e","Type":"ContainerDied","Data":"30373b71548fdac90de777a026a4e21d75377e3d0560d3664cc3bef1f2e5b0ac"} Mar 20 07:16:23 crc kubenswrapper[4749]: I0320 07:16:23.798821 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 07:16:23 crc kubenswrapper[4749]: I0320 07:16:23.798827 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30373b71548fdac90de777a026a4e21d75377e3d0560d3664cc3bef1f2e5b0ac" Mar 20 07:16:23 crc kubenswrapper[4749]: I0320 07:16:23.958391 4749 patch_prober.go:28] interesting pod/router-default-5444994796-dbs7t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 07:16:23 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Mar 20 07:16:23 crc kubenswrapper[4749]: [+]process-running ok Mar 20 07:16:23 crc kubenswrapper[4749]: healthz check failed Mar 20 07:16:23 crc kubenswrapper[4749]: I0320 07:16:23.958448 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dbs7t" podUID="da62d543-787a-4364-8271-8f8f9529dd0c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 07:16:24 crc kubenswrapper[4749]: I0320 07:16:24.433165 4749 ???:1] "http: TLS handshake error from 192.168.126.11:44208: no serving certificate available for the kubelet" Mar 20 07:16:24 crc kubenswrapper[4749]: I0320 07:16:24.957112 4749 patch_prober.go:28] interesting pod/router-default-5444994796-dbs7t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 07:16:24 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Mar 20 07:16:24 crc kubenswrapper[4749]: [+]process-running ok Mar 20 07:16:24 crc kubenswrapper[4749]: healthz check failed Mar 20 07:16:24 crc kubenswrapper[4749]: I0320 07:16:24.957178 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dbs7t" podUID="da62d543-787a-4364-8271-8f8f9529dd0c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 07:16:25 crc kubenswrapper[4749]: I0320 07:16:25.494864 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-t8g8d" Mar 20 07:16:25 crc kubenswrapper[4749]: I0320 07:16:25.956868 4749 patch_prober.go:28] interesting pod/router-default-5444994796-dbs7t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 07:16:25 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Mar 20 07:16:25 crc kubenswrapper[4749]: [+]process-running ok Mar 20 07:16:25 crc kubenswrapper[4749]: healthz check failed Mar 20 07:16:25 crc kubenswrapper[4749]: I0320 07:16:25.956913 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dbs7t" podUID="da62d543-787a-4364-8271-8f8f9529dd0c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 07:16:26 crc kubenswrapper[4749]: I0320 07:16:26.551209 4749 ???:1] "http: TLS handshake error from 192.168.126.11:44224: no serving certificate available for the kubelet" Mar 20 07:16:26 crc kubenswrapper[4749]: I0320 07:16:26.956460 4749 patch_prober.go:28] interesting pod/router-default-5444994796-dbs7t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP 
Mar 20 07:16:29 crc kubenswrapper[4749]: I0320 07:16:29.530975 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-952x2"
Mar 20 07:16:30 crc kubenswrapper[4749]: I0320 07:16:30.192839 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-2zlqs"
Mar 20 07:16:30 crc kubenswrapper[4749]: I0320 07:16:30.198387 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-2zlqs"
Mar 20 07:16:34 crc kubenswrapper[4749]: I0320 07:16:34.514777 4749 patch_prober.go:28] interesting pod/machine-config-daemon-fxqfd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 07:16:34 crc kubenswrapper[4749]: I0320 07:16:34.515448 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 07:16:36 crc kubenswrapper[4749]: I0320 07:16:36.628498 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6bfbf87665-k7ffm"]
Mar 20 07:16:36 crc kubenswrapper[4749]: I0320 07:16:36.628701 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6bfbf87665-k7ffm" podUID="97ac526c-98ac-4cb7-8e84-00b4f9808a06" containerName="controller-manager" containerID="cri-o://32ae71a88ffe40e367c62e61dd4e8374d45e24169ad616ac80d6e8e69bbaa43b" gracePeriod=30
Mar 20 07:16:36 crc kubenswrapper[4749]: I0320 07:16:36.637993 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c474dd6b9-fgnch"]
Mar 20 07:16:36 crc kubenswrapper[4749]: I0320 07:16:36.638576 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6c474dd6b9-fgnch" podUID="7b3e2383-a11b-4ef1-a2be-6d75f4b1babd" containerName="route-controller-manager" containerID="cri-o://9750f87260ff033336215122e6ed2ab1737c8f199935f7ec5c1da9c327e058cc" gracePeriod=30
Mar 20 07:16:36 crc kubenswrapper[4749]: I0320 07:16:36.884486 4749 generic.go:334] "Generic (PLEG): container finished" podID="97ac526c-98ac-4cb7-8e84-00b4f9808a06" containerID="32ae71a88ffe40e367c62e61dd4e8374d45e24169ad616ac80d6e8e69bbaa43b" exitCode=0
Mar 20 07:16:36 crc kubenswrapper[4749]: I0320 07:16:36.884554 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6bfbf87665-k7ffm" event={"ID":"97ac526c-98ac-4cb7-8e84-00b4f9808a06","Type":"ContainerDied","Data":"32ae71a88ffe40e367c62e61dd4e8374d45e24169ad616ac80d6e8e69bbaa43b"}
Mar 20 07:16:36 crc kubenswrapper[4749]: I0320 07:16:36.887670 4749 generic.go:334] "Generic (PLEG): container finished" podID="7b3e2383-a11b-4ef1-a2be-6d75f4b1babd" containerID="9750f87260ff033336215122e6ed2ab1737c8f199935f7ec5c1da9c327e058cc" exitCode=0
Mar 20 07:16:36 crc kubenswrapper[4749]: I0320 07:16:36.887694 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c474dd6b9-fgnch" event={"ID":"7b3e2383-a11b-4ef1-a2be-6d75f4b1babd","Type":"ContainerDied","Data":"9750f87260ff033336215122e6ed2ab1737c8f199935f7ec5c1da9c327e058cc"}
Mar 20 07:16:38 crc kubenswrapper[4749]: I0320 07:16:38.003496 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk"
Mar 20 07:16:39 crc kubenswrapper[4749]: I0320 07:16:39.356168 4749 patch_prober.go:28] interesting pod/route-controller-manager-6c474dd6b9-fgnch container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused" start-of-body=
Mar 20 07:16:39 crc kubenswrapper[4749]: I0320 07:16:39.356218 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6c474dd6b9-fgnch" podUID="7b3e2383-a11b-4ef1-a2be-6d75f4b1babd" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused"
Mar 20 07:16:40 crc kubenswrapper[4749]: I0320 07:16:40.360426 4749 patch_prober.go:28] interesting pod/controller-manager-6bfbf87665-k7ffm container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.56:8443/healthz\": dial tcp 10.217.0.56:8443: connect: connection refused" start-of-body=
Mar 20 07:16:40 crc kubenswrapper[4749]: I0320 07:16:40.360528 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6bfbf87665-k7ffm" podUID="97ac526c-98ac-4cb7-8e84-00b4f9808a06" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.56:8443/healthz\": dial tcp 10.217.0.56:8443: connect: connection refused"
Mar 20 07:16:41 crc kubenswrapper[4749]: E0320 07:16:41.477648 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest"
Mar 20 07:16:41 crc kubenswrapper[4749]: E0320 07:16:41.478066 4749 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 20 07:16:41 crc kubenswrapper[4749]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve
Mar 20 07:16:41 crc kubenswrapper[4749]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-r7xkd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29566516-6stbk_openshift-infra(da95cd86-f90a-4d7f-a308-4124b22d8427): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled
Mar 20 07:16:41 crc kubenswrapper[4749]: > logger="UnhandledError"
Mar 20 07:16:41 crc kubenswrapper[4749]: E0320 07:16:41.479244 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29566516-6stbk" podUID="da95cd86-f90a-4d7f-a308-4124b22d8427"
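This pull failure is the likely reason the "no serving certificate available for the kubelet" TLS handshake errors keep repeating above: the auto-csr-approver CronJob exists to approve pending CSRs (the kubelet's serving certificate among them), but its ose-cli image never arrives, so nothing gets approved. The command the container would have run is quoted verbatim in the spec above and can be run by hand from any host with cluster-admin credentials:

  # approve every CSR that has no status yet (same pipeline the CronJob uses)
  oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' \
    | xargs --no-run-if-empty oc adm certificate approve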
Mar 20 07:16:41 crc kubenswrapper[4749]: I0320 07:16:41.488163 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c474dd6b9-fgnch"
Mar 20 07:16:41 crc kubenswrapper[4749]: I0320 07:16:41.509444 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5787f8cc8-vkt6l"]
Mar 20 07:16:41 crc kubenswrapper[4749]: E0320 07:16:41.509627 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d83d04c8-afe2-49f9-b0ab-f63d9f79544e" containerName="pruner"
Mar 20 07:16:41 crc kubenswrapper[4749]: I0320 07:16:41.509637 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d83d04c8-afe2-49f9-b0ab-f63d9f79544e" containerName="pruner"
Mar 20 07:16:41 crc kubenswrapper[4749]: E0320 07:16:41.509648 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa5c3d7c-62dd-441e-a0e2-89e2eae56970" containerName="pruner"
Mar 20 07:16:41 crc kubenswrapper[4749]: I0320 07:16:41.509654 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa5c3d7c-62dd-441e-a0e2-89e2eae56970" containerName="pruner"
Mar 20 07:16:41 crc kubenswrapper[4749]: E0320 07:16:41.509664 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b3e2383-a11b-4ef1-a2be-6d75f4b1babd" containerName="route-controller-manager"
Mar 20 07:16:41 crc kubenswrapper[4749]: I0320 07:16:41.509670 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b3e2383-a11b-4ef1-a2be-6d75f4b1babd" containerName="route-controller-manager"
Mar 20 07:16:41 crc kubenswrapper[4749]: E0320 07:16:41.509681 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2af8695f-a945-411d-ac95-03191fb3080d" containerName="collect-profiles"
Mar 20 07:16:41 crc kubenswrapper[4749]: I0320 07:16:41.509687 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2af8695f-a945-411d-ac95-03191fb3080d" containerName="collect-profiles"
Mar 20 07:16:41 crc kubenswrapper[4749]: I0320 07:16:41.509812 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b3e2383-a11b-4ef1-a2be-6d75f4b1babd" containerName="route-controller-manager"
Mar 20 07:16:41 crc kubenswrapper[4749]: I0320 07:16:41.509822 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d83d04c8-afe2-49f9-b0ab-f63d9f79544e" containerName="pruner"
Mar 20 07:16:41 crc kubenswrapper[4749]: I0320 07:16:41.509830 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2af8695f-a945-411d-ac95-03191fb3080d" containerName="collect-profiles"
Mar 20 07:16:41 crc kubenswrapper[4749]: I0320 07:16:41.509839 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa5c3d7c-62dd-441e-a0e2-89e2eae56970" containerName="pruner"
Mar 20 07:16:41 crc kubenswrapper[4749]: I0320 07:16:41.510168 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5787f8cc8-vkt6l"
Mar 20 07:16:41 crc kubenswrapper[4749]: I0320 07:16:41.572083 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b3e2383-a11b-4ef1-a2be-6d75f4b1babd-serving-cert\") pod \"7b3e2383-a11b-4ef1-a2be-6d75f4b1babd\" (UID: \"7b3e2383-a11b-4ef1-a2be-6d75f4b1babd\") "
Mar 20 07:16:41 crc kubenswrapper[4749]: I0320 07:16:41.572407 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b3e2383-a11b-4ef1-a2be-6d75f4b1babd-config\") pod \"7b3e2383-a11b-4ef1-a2be-6d75f4b1babd\" (UID: \"7b3e2383-a11b-4ef1-a2be-6d75f4b1babd\") "
Mar 20 07:16:41 crc kubenswrapper[4749]: I0320 07:16:41.572436 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7b3e2383-a11b-4ef1-a2be-6d75f4b1babd-client-ca\") pod \"7b3e2383-a11b-4ef1-a2be-6d75f4b1babd\" (UID: \"7b3e2383-a11b-4ef1-a2be-6d75f4b1babd\") "
Mar 20 07:16:41 crc kubenswrapper[4749]: I0320 07:16:41.572505 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjv6b\" (UniqueName: \"kubernetes.io/projected/7b3e2383-a11b-4ef1-a2be-6d75f4b1babd-kube-api-access-rjv6b\") pod \"7b3e2383-a11b-4ef1-a2be-6d75f4b1babd\" (UID: \"7b3e2383-a11b-4ef1-a2be-6d75f4b1babd\") "
Mar 20 07:16:41 crc kubenswrapper[4749]: I0320 07:16:41.573457 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b3e2383-a11b-4ef1-a2be-6d75f4b1babd-config" (OuterVolumeSpecName: "config") pod "7b3e2383-a11b-4ef1-a2be-6d75f4b1babd" (UID: "7b3e2383-a11b-4ef1-a2be-6d75f4b1babd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 07:16:41 crc kubenswrapper[4749]: I0320 07:16:41.573770 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b3e2383-a11b-4ef1-a2be-6d75f4b1babd-client-ca" (OuterVolumeSpecName: "client-ca") pod "7b3e2383-a11b-4ef1-a2be-6d75f4b1babd" (UID: "7b3e2383-a11b-4ef1-a2be-6d75f4b1babd"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:16:41 crc kubenswrapper[4749]: I0320 07:16:41.580072 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b3e2383-a11b-4ef1-a2be-6d75f4b1babd-kube-api-access-rjv6b" (OuterVolumeSpecName: "kube-api-access-rjv6b") pod "7b3e2383-a11b-4ef1-a2be-6d75f4b1babd" (UID: "7b3e2383-a11b-4ef1-a2be-6d75f4b1babd"). InnerVolumeSpecName "kube-api-access-rjv6b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:16:41 crc kubenswrapper[4749]: I0320 07:16:41.590099 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5787f8cc8-vkt6l"] Mar 20 07:16:41 crc kubenswrapper[4749]: I0320 07:16:41.674122 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0cd57b81-ff84-4d13-902f-36d9368d7421-serving-cert\") pod \"route-controller-manager-5787f8cc8-vkt6l\" (UID: \"0cd57b81-ff84-4d13-902f-36d9368d7421\") " pod="openshift-route-controller-manager/route-controller-manager-5787f8cc8-vkt6l" Mar 20 07:16:41 crc kubenswrapper[4749]: I0320 07:16:41.674372 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0cd57b81-ff84-4d13-902f-36d9368d7421-client-ca\") pod \"route-controller-manager-5787f8cc8-vkt6l\" (UID: \"0cd57b81-ff84-4d13-902f-36d9368d7421\") " pod="openshift-route-controller-manager/route-controller-manager-5787f8cc8-vkt6l" Mar 20 07:16:41 crc kubenswrapper[4749]: I0320 07:16:41.674607 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cd57b81-ff84-4d13-902f-36d9368d7421-config\") pod \"route-controller-manager-5787f8cc8-vkt6l\" (UID: \"0cd57b81-ff84-4d13-902f-36d9368d7421\") " pod="openshift-route-controller-manager/route-controller-manager-5787f8cc8-vkt6l" Mar 20 07:16:41 crc kubenswrapper[4749]: I0320 07:16:41.674840 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j45cf\" (UniqueName: \"kubernetes.io/projected/0cd57b81-ff84-4d13-902f-36d9368d7421-kube-api-access-j45cf\") pod \"route-controller-manager-5787f8cc8-vkt6l\" (UID: \"0cd57b81-ff84-4d13-902f-36d9368d7421\") " pod="openshift-route-controller-manager/route-controller-manager-5787f8cc8-vkt6l" Mar 20 07:16:41 crc kubenswrapper[4749]: I0320 07:16:41.675129 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b3e2383-a11b-4ef1-a2be-6d75f4b1babd-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:16:41 crc kubenswrapper[4749]: I0320 07:16:41.675204 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7b3e2383-a11b-4ef1-a2be-6d75f4b1babd-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 07:16:41 crc kubenswrapper[4749]: I0320 07:16:41.675226 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjv6b\" (UniqueName: \"kubernetes.io/projected/7b3e2383-a11b-4ef1-a2be-6d75f4b1babd-kube-api-access-rjv6b\") on node \"crc\" DevicePath \"\"" Mar 20 07:16:41 crc kubenswrapper[4749]: I0320 07:16:41.675242 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7b3e2383-a11b-4ef1-a2be-6d75f4b1babd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 07:16:41 crc kubenswrapper[4749]: I0320 07:16:41.776361 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j45cf\" (UniqueName: \"kubernetes.io/projected/0cd57b81-ff84-4d13-902f-36d9368d7421-kube-api-access-j45cf\") pod \"route-controller-manager-5787f8cc8-vkt6l\" (UID: \"0cd57b81-ff84-4d13-902f-36d9368d7421\") " pod="openshift-route-controller-manager/route-controller-manager-5787f8cc8-vkt6l" Mar 20 07:16:41 crc kubenswrapper[4749]: I0320 07:16:41.776459 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0cd57b81-ff84-4d13-902f-36d9368d7421-serving-cert\") pod \"route-controller-manager-5787f8cc8-vkt6l\" (UID: \"0cd57b81-ff84-4d13-902f-36d9368d7421\") " pod="openshift-route-controller-manager/route-controller-manager-5787f8cc8-vkt6l" Mar 20 07:16:41 crc kubenswrapper[4749]: I0320 07:16:41.776510 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0cd57b81-ff84-4d13-902f-36d9368d7421-client-ca\") pod \"route-controller-manager-5787f8cc8-vkt6l\" (UID: \"0cd57b81-ff84-4d13-902f-36d9368d7421\") " pod="openshift-route-controller-manager/route-controller-manager-5787f8cc8-vkt6l" Mar 20 07:16:41 crc kubenswrapper[4749]: I0320 07:16:41.776553 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cd57b81-ff84-4d13-902f-36d9368d7421-config\") pod \"route-controller-manager-5787f8cc8-vkt6l\" (UID: \"0cd57b81-ff84-4d13-902f-36d9368d7421\") " pod="openshift-route-controller-manager/route-controller-manager-5787f8cc8-vkt6l" Mar 20 07:16:41 crc kubenswrapper[4749]: I0320 07:16:41.778269 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0cd57b81-ff84-4d13-902f-36d9368d7421-client-ca\") pod \"route-controller-manager-5787f8cc8-vkt6l\" (UID: \"0cd57b81-ff84-4d13-902f-36d9368d7421\") " pod="openshift-route-controller-manager/route-controller-manager-5787f8cc8-vkt6l" Mar 20 07:16:41 crc kubenswrapper[4749]: I0320 07:16:41.780139 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cd57b81-ff84-4d13-902f-36d9368d7421-config\") pod \"route-controller-manager-5787f8cc8-vkt6l\" (UID: \"0cd57b81-ff84-4d13-902f-36d9368d7421\") " pod="openshift-route-controller-manager/route-controller-manager-5787f8cc8-vkt6l" Mar 20 07:16:41 crc kubenswrapper[4749]: I0320 07:16:41.795134 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0cd57b81-ff84-4d13-902f-36d9368d7421-serving-cert\") pod \"route-controller-manager-5787f8cc8-vkt6l\" (UID: \"0cd57b81-ff84-4d13-902f-36d9368d7421\") " pod="openshift-route-controller-manager/route-controller-manager-5787f8cc8-vkt6l" Mar 20 07:16:41 crc kubenswrapper[4749]: I0320 07:16:41.798182 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j45cf\" (UniqueName: \"kubernetes.io/projected/0cd57b81-ff84-4d13-902f-36d9368d7421-kube-api-access-j45cf\") pod \"route-controller-manager-5787f8cc8-vkt6l\" (UID: \"0cd57b81-ff84-4d13-902f-36d9368d7421\") " pod="openshift-route-controller-manager/route-controller-manager-5787f8cc8-vkt6l" Mar 20 
Mar 20 07:16:41 crc kubenswrapper[4749]: I0320 07:16:41.911757 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5787f8cc8-vkt6l"
Mar 20 07:16:41 crc kubenswrapper[4749]: I0320 07:16:41.920578 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c474dd6b9-fgnch" event={"ID":"7b3e2383-a11b-4ef1-a2be-6d75f4b1babd","Type":"ContainerDied","Data":"cabb6f16f7765a735f96274e3398b4edf156a1cc7a8a2dd52cbdb37ab4c29631"}
Mar 20 07:16:41 crc kubenswrapper[4749]: I0320 07:16:41.920634 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c474dd6b9-fgnch"
Mar 20 07:16:41 crc kubenswrapper[4749]: I0320 07:16:41.920646 4749 scope.go:117] "RemoveContainer" containerID="9750f87260ff033336215122e6ed2ab1737c8f199935f7ec5c1da9c327e058cc"
Mar 20 07:16:41 crc kubenswrapper[4749]: E0320 07:16:41.921558 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29566516-6stbk" podUID="da95cd86-f90a-4d7f-a308-4124b22d8427"
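At this point the kubelet has moved auto-csr-approver from ErrImagePull into ImagePullBackOff, so further pull attempts are delayed with exponential backoff instead of being retried on every pod sync. A hedged pair of read-only commands for watching the backoff from the API side (pod and namespace names taken from the log):

  oc -n openshift-infra get pod auto-csr-approver-29566516-6stbk \
    -o jsonpath='{.status.containerStatuses[0].state.waiting}{"\n"}'
  oc -n openshift-infra get events \
    --field-selector involvedObject.name=auto-csr-approver-29566516-6stbk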
Need to start a new one" pod="openshift-controller-manager/controller-manager-8d8fb768b-2zhpc" Mar 20 07:16:44 crc kubenswrapper[4749]: I0320 07:16:44.360221 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8d8fb768b-2zhpc"] Mar 20 07:16:44 crc kubenswrapper[4749]: I0320 07:16:44.408722 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kmwx\" (UniqueName: \"kubernetes.io/projected/97ac526c-98ac-4cb7-8e84-00b4f9808a06-kube-api-access-4kmwx\") pod \"97ac526c-98ac-4cb7-8e84-00b4f9808a06\" (UID: \"97ac526c-98ac-4cb7-8e84-00b4f9808a06\") " Mar 20 07:16:44 crc kubenswrapper[4749]: I0320 07:16:44.408765 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/97ac526c-98ac-4cb7-8e84-00b4f9808a06-client-ca\") pod \"97ac526c-98ac-4cb7-8e84-00b4f9808a06\" (UID: \"97ac526c-98ac-4cb7-8e84-00b4f9808a06\") " Mar 20 07:16:44 crc kubenswrapper[4749]: I0320 07:16:44.408812 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97ac526c-98ac-4cb7-8e84-00b4f9808a06-serving-cert\") pod \"97ac526c-98ac-4cb7-8e84-00b4f9808a06\" (UID: \"97ac526c-98ac-4cb7-8e84-00b4f9808a06\") " Mar 20 07:16:44 crc kubenswrapper[4749]: I0320 07:16:44.408842 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97ac526c-98ac-4cb7-8e84-00b4f9808a06-config\") pod \"97ac526c-98ac-4cb7-8e84-00b4f9808a06\" (UID: \"97ac526c-98ac-4cb7-8e84-00b4f9808a06\") " Mar 20 07:16:44 crc kubenswrapper[4749]: I0320 07:16:44.408923 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/97ac526c-98ac-4cb7-8e84-00b4f9808a06-proxy-ca-bundles\") pod \"97ac526c-98ac-4cb7-8e84-00b4f9808a06\" (UID: \"97ac526c-98ac-4cb7-8e84-00b4f9808a06\") " Mar 20 07:16:44 crc kubenswrapper[4749]: I0320 07:16:44.409968 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97ac526c-98ac-4cb7-8e84-00b4f9808a06-client-ca" (OuterVolumeSpecName: "client-ca") pod "97ac526c-98ac-4cb7-8e84-00b4f9808a06" (UID: "97ac526c-98ac-4cb7-8e84-00b4f9808a06"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:16:44 crc kubenswrapper[4749]: I0320 07:16:44.409984 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97ac526c-98ac-4cb7-8e84-00b4f9808a06-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "97ac526c-98ac-4cb7-8e84-00b4f9808a06" (UID: "97ac526c-98ac-4cb7-8e84-00b4f9808a06"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:16:44 crc kubenswrapper[4749]: I0320 07:16:44.410490 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97ac526c-98ac-4cb7-8e84-00b4f9808a06-config" (OuterVolumeSpecName: "config") pod "97ac526c-98ac-4cb7-8e84-00b4f9808a06" (UID: "97ac526c-98ac-4cb7-8e84-00b4f9808a06"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:16:44 crc kubenswrapper[4749]: I0320 07:16:44.414635 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97ac526c-98ac-4cb7-8e84-00b4f9808a06-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "97ac526c-98ac-4cb7-8e84-00b4f9808a06" (UID: "97ac526c-98ac-4cb7-8e84-00b4f9808a06"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:16:44 crc kubenswrapper[4749]: I0320 07:16:44.414955 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97ac526c-98ac-4cb7-8e84-00b4f9808a06-kube-api-access-4kmwx" (OuterVolumeSpecName: "kube-api-access-4kmwx") pod "97ac526c-98ac-4cb7-8e84-00b4f9808a06" (UID: "97ac526c-98ac-4cb7-8e84-00b4f9808a06"). InnerVolumeSpecName "kube-api-access-4kmwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:16:44 crc kubenswrapper[4749]: I0320 07:16:44.510397 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a24c70a4-ec51-459d-a464-65dd450c2366-client-ca\") pod \"controller-manager-8d8fb768b-2zhpc\" (UID: \"a24c70a4-ec51-459d-a464-65dd450c2366\") " pod="openshift-controller-manager/controller-manager-8d8fb768b-2zhpc" Mar 20 07:16:44 crc kubenswrapper[4749]: I0320 07:16:44.510442 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a24c70a4-ec51-459d-a464-65dd450c2366-proxy-ca-bundles\") pod \"controller-manager-8d8fb768b-2zhpc\" (UID: \"a24c70a4-ec51-459d-a464-65dd450c2366\") " pod="openshift-controller-manager/controller-manager-8d8fb768b-2zhpc" Mar 20 07:16:44 crc kubenswrapper[4749]: I0320 07:16:44.510468 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v66qh\" (UniqueName: \"kubernetes.io/projected/a24c70a4-ec51-459d-a464-65dd450c2366-kube-api-access-v66qh\") pod \"controller-manager-8d8fb768b-2zhpc\" (UID: \"a24c70a4-ec51-459d-a464-65dd450c2366\") " pod="openshift-controller-manager/controller-manager-8d8fb768b-2zhpc" Mar 20 07:16:44 crc kubenswrapper[4749]: I0320 07:16:44.510508 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a24c70a4-ec51-459d-a464-65dd450c2366-serving-cert\") pod \"controller-manager-8d8fb768b-2zhpc\" (UID: \"a24c70a4-ec51-459d-a464-65dd450c2366\") " pod="openshift-controller-manager/controller-manager-8d8fb768b-2zhpc" Mar 20 07:16:44 crc kubenswrapper[4749]: I0320 07:16:44.510550 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a24c70a4-ec51-459d-a464-65dd450c2366-config\") pod \"controller-manager-8d8fb768b-2zhpc\" (UID: \"a24c70a4-ec51-459d-a464-65dd450c2366\") " pod="openshift-controller-manager/controller-manager-8d8fb768b-2zhpc" Mar 20 07:16:44 crc kubenswrapper[4749]: I0320 07:16:44.510589 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4kmwx\" (UniqueName: \"kubernetes.io/projected/97ac526c-98ac-4cb7-8e84-00b4f9808a06-kube-api-access-4kmwx\") on node \"crc\" DevicePath \"\"" Mar 20 07:16:44 crc kubenswrapper[4749]: I0320 07:16:44.510600 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/97ac526c-98ac-4cb7-8e84-00b4f9808a06-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 07:16:44 crc kubenswrapper[4749]: I0320 07:16:44.510609 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97ac526c-98ac-4cb7-8e84-00b4f9808a06-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 07:16:44 crc kubenswrapper[4749]: I0320 07:16:44.510618 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97ac526c-98ac-4cb7-8e84-00b4f9808a06-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:16:44 crc kubenswrapper[4749]: I0320 07:16:44.510626 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/97ac526c-98ac-4cb7-8e84-00b4f9808a06-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 07:16:44 crc kubenswrapper[4749]: I0320 07:16:44.611758 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a24c70a4-ec51-459d-a464-65dd450c2366-serving-cert\") pod \"controller-manager-8d8fb768b-2zhpc\" (UID: \"a24c70a4-ec51-459d-a464-65dd450c2366\") " pod="openshift-controller-manager/controller-manager-8d8fb768b-2zhpc" Mar 20 07:16:44 crc kubenswrapper[4749]: I0320 07:16:44.611794 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a24c70a4-ec51-459d-a464-65dd450c2366-config\") pod \"controller-manager-8d8fb768b-2zhpc\" (UID: \"a24c70a4-ec51-459d-a464-65dd450c2366\") " pod="openshift-controller-manager/controller-manager-8d8fb768b-2zhpc" Mar 20 07:16:44 crc kubenswrapper[4749]: I0320 07:16:44.611835 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a24c70a4-ec51-459d-a464-65dd450c2366-client-ca\") pod \"controller-manager-8d8fb768b-2zhpc\" (UID: \"a24c70a4-ec51-459d-a464-65dd450c2366\") " pod="openshift-controller-manager/controller-manager-8d8fb768b-2zhpc" Mar 20 07:16:44 crc kubenswrapper[4749]: I0320 07:16:44.611856 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a24c70a4-ec51-459d-a464-65dd450c2366-proxy-ca-bundles\") pod \"controller-manager-8d8fb768b-2zhpc\" (UID: \"a24c70a4-ec51-459d-a464-65dd450c2366\") " pod="openshift-controller-manager/controller-manager-8d8fb768b-2zhpc" Mar 20 07:16:44 crc kubenswrapper[4749]: I0320 07:16:44.611881 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v66qh\" (UniqueName: \"kubernetes.io/projected/a24c70a4-ec51-459d-a464-65dd450c2366-kube-api-access-v66qh\") pod \"controller-manager-8d8fb768b-2zhpc\" (UID: \"a24c70a4-ec51-459d-a464-65dd450c2366\") " pod="openshift-controller-manager/controller-manager-8d8fb768b-2zhpc" Mar 20 07:16:44 crc kubenswrapper[4749]: I0320 07:16:44.613261 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a24c70a4-ec51-459d-a464-65dd450c2366-client-ca\") pod \"controller-manager-8d8fb768b-2zhpc\" (UID: \"a24c70a4-ec51-459d-a464-65dd450c2366\") " pod="openshift-controller-manager/controller-manager-8d8fb768b-2zhpc" Mar 20 07:16:44 crc kubenswrapper[4749]: I0320 07:16:44.614030 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a24c70a4-ec51-459d-a464-65dd450c2366-config\") pod \"controller-manager-8d8fb768b-2zhpc\" (UID: \"a24c70a4-ec51-459d-a464-65dd450c2366\") " pod="openshift-controller-manager/controller-manager-8d8fb768b-2zhpc" Mar 20 07:16:44 crc kubenswrapper[4749]: I0320 07:16:44.614454 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a24c70a4-ec51-459d-a464-65dd450c2366-proxy-ca-bundles\") pod \"controller-manager-8d8fb768b-2zhpc\" (UID: \"a24c70a4-ec51-459d-a464-65dd450c2366\") " pod="openshift-controller-manager/controller-manager-8d8fb768b-2zhpc" Mar 20 07:16:44 crc kubenswrapper[4749]: I0320 07:16:44.615037 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a24c70a4-ec51-459d-a464-65dd450c2366-serving-cert\") pod \"controller-manager-8d8fb768b-2zhpc\" (UID: \"a24c70a4-ec51-459d-a464-65dd450c2366\") " pod="openshift-controller-manager/controller-manager-8d8fb768b-2zhpc" Mar 20 07:16:44 crc kubenswrapper[4749]: I0320 07:16:44.627343 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v66qh\" (UniqueName: \"kubernetes.io/projected/a24c70a4-ec51-459d-a464-65dd450c2366-kube-api-access-v66qh\") pod \"controller-manager-8d8fb768b-2zhpc\" (UID: \"a24c70a4-ec51-459d-a464-65dd450c2366\") " pod="openshift-controller-manager/controller-manager-8d8fb768b-2zhpc" Mar 20 07:16:44 crc kubenswrapper[4749]: I0320 07:16:44.675104 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8d8fb768b-2zhpc" Mar 20 07:16:44 crc kubenswrapper[4749]: I0320 07:16:44.936612 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6bfbf87665-k7ffm" event={"ID":"97ac526c-98ac-4cb7-8e84-00b4f9808a06","Type":"ContainerDied","Data":"651001f401166c72df2c1e526f7425029fc3bd945ef3fdd395563df958218e8d"} Mar 20 07:16:44 crc kubenswrapper[4749]: I0320 07:16:44.936651 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6bfbf87665-k7ffm" Mar 20 07:16:44 crc kubenswrapper[4749]: I0320 07:16:44.963328 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6bfbf87665-k7ffm"] Mar 20 07:16:44 crc kubenswrapper[4749]: I0320 07:16:44.975502 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6bfbf87665-k7ffm"] Mar 20 07:16:46 crc kubenswrapper[4749]: I0320 07:16:46.186873 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97ac526c-98ac-4cb7-8e84-00b4f9808a06" path="/var/lib/kubelet/pods/97ac526c-98ac-4cb7-8e84-00b4f9808a06/volumes" Mar 20 07:16:47 crc kubenswrapper[4749]: E0320 07:16:47.058442 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 20 07:16:47 crc kubenswrapper[4749]: E0320 07:16:47.058604 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fgp9z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-rv5h9_openshift-marketplace(b7e5d15e-f3f5-4595-be01-ae4f196285ad): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 07:16:47 crc kubenswrapper[4749]: I0320 07:16:47.058832 4749 ???:1] "http: TLS handshake error from 192.168.126.11:60180: no serving certificate available for the kubelet" Mar 20 07:16:47 crc kubenswrapper[4749]: E0320 07:16:47.060031 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-rv5h9" podUID="b7e5d15e-f3f5-4595-be01-ae4f196285ad" Mar 20 07:16:47 crc 
Mar 20 07:16:47 crc kubenswrapper[4749]: E0320 07:16:47.091791 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Mar 20 07:16:47 crc kubenswrapper[4749]: E0320 07:16:47.092181 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hk8jn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-x9swj_openshift-marketplace(a22f47dc-59ce-4cce-821c-508fc14a9508): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 20 07:16:47 crc kubenswrapper[4749]: E0320 07:16:47.093426 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-x9swj" podUID="a22f47dc-59ce-4cce-821c-508fc14a9508"
Mar 20 07:16:48 crc kubenswrapper[4749]: E0320 07:16:48.614690 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-rv5h9" podUID="b7e5d15e-f3f5-4595-be01-ae4f196285ad"
Mar 20 07:16:48 crc kubenswrapper[4749]: E0320 07:16:48.614742 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-x9swj" podUID="a22f47dc-59ce-4cce-821c-508fc14a9508"
Mar 20 07:16:48 crc kubenswrapper[4749]: E0320 07:16:48.691327 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Mar 20 07:16:48 crc kubenswrapper[4749]: E0320 07:16:48.691726 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ckwfm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-bbjcq_openshift-marketplace(ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 20 07:16:48 crc kubenswrapper[4749]: E0320 07:16:48.692986 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-bbjcq" podUID="ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0"
Mar 20 07:16:50 crc kubenswrapper[4749]: I0320 07:16:50.714341 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5pwjp"
Mar 20 07:16:52 crc kubenswrapper[4749]: E0320 07:16:52.108520 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-bbjcq" podUID="ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0"
Mar 20 07:16:52 crc kubenswrapper[4749]: I0320 07:16:52.117900 4749 scope.go:117] "RemoveContainer" containerID="32ae71a88ffe40e367c62e61dd4e8374d45e24169ad616ac80d6e8e69bbaa43b"
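The cadence above, ErrImagePull at 07:16:47, "Back-off pulling image" skips at 07:16:48, a fresh attempt a few seconds later, is the kubelet's per-image exponential backoff. A sketch using the client-go utility the kubelet relies on; the 10s initial delay doubling to a 300s cap are the usual defaults, stated here as an assumption:

    package main

    import (
        "fmt"
        "time"

        "k8s.io/client-go/util/flowcontrol"
    )

    func main() {
        // Per-image backoff: 10s initial, doubling on each failure, 300s cap.
        backoff := flowcontrol.NewBackOff(10*time.Second, 300*time.Second)
        key := "registry.redhat.io/redhat/community-operator-index:v4.18"

        for attempt := 1; attempt <= 4; attempt++ {
            if backoff.IsInBackOffSinceUpdate(key, time.Now()) {
                // This is when the kubelet reports ImagePullBackOff and skips the sync.
                fmt.Printf("attempt %d: in back-off for %v, skipping\n",
                    attempt, backoff.Get(key))
                continue
            }
            fmt.Printf("attempt %d: pulling (fails again)\n", attempt)
            backoff.Next(key, time.Now()) // record the failure, doubling the delay
        }
    }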
image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 20 07:16:52 crc kubenswrapper[4749]: E0320 07:16:52.192606 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rlzhl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-lq4rb_openshift-marketplace(937dac41-5afa-495a-9909-1152a419549c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 07:16:52 crc kubenswrapper[4749]: E0320 07:16:52.194598 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-lq4rb" podUID="937dac41-5afa-495a-9909-1152a419549c" Mar 20 07:16:52 crc kubenswrapper[4749]: E0320 07:16:52.219924 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 20 07:16:52 crc kubenswrapper[4749]: E0320 07:16:52.220055 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d65vz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-9ss5d_openshift-marketplace(a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 07:16:52 crc kubenswrapper[4749]: E0320 07:16:52.221265 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-9ss5d" podUID="a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8" Mar 20 07:16:52 crc kubenswrapper[4749]: E0320 07:16:52.277142 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 20 07:16:52 crc kubenswrapper[4749]: E0320 07:16:52.277600 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
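Each "Unhandled Error" blob above is a Go-syntax dump of the computed v1.Container for the catalog pods' extract-content init container. Re-rendered below as a readable Go literal carrying only the non-zero fields from the dump (the ptr helper is mine, not from the log):

    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
    )

    // ptr is a small helper for the pointer-valued fields.
    func ptr[T any](v T) *T { return &v }

    var extractContent = corev1.Container{
        Name:    "extract-content",
        Image:   "registry.redhat.io/redhat/certified-operator-index:v4.18",
        Command: []string{"/utilities/copy-content"},
        Args: []string{
            "--catalog.from=/configs",
            "--catalog.to=/extracted-catalog/catalog",
            "--cache.from=/tmp/cache",
            "--cache.to=/extracted-catalog/cache",
        },
        VolumeMounts: []corev1.VolumeMount{
            {Name: "utilities", MountPath: "/utilities"},
            {Name: "catalog-content", MountPath: "/extracted-catalog"},
            {Name: "kube-api-access-d65vz", ReadOnly: true,
                MountPath: "/var/run/secrets/kubernetes.io/serviceaccount"},
        },
        TerminationMessagePath:   "/dev/termination-log",
        TerminationMessagePolicy: corev1.TerminationMessageFallbackToLogsOnError,
        ImagePullPolicy:          corev1.PullAlways,
        SecurityContext: &corev1.SecurityContext{
            Capabilities:             &corev1.Capabilities{Drop: []corev1.Capability{"ALL"}},
            RunAsUser:                ptr(int64(1000170000)),
            RunAsNonRoot:             ptr(true),
            AllowPrivilegeEscalation: ptr(false),
        },
    }

    func main() { fmt.Println(extractContent.Name) }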
Mar 20 07:16:52 crc kubenswrapper[4749]: E0320 07:16:52.277600 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d4zwd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-m7xc9_openshift-marketplace(8faad596-00ed-4982-9f42-2f1a2465098c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 20 07:16:52 crc kubenswrapper[4749]: E0320 07:16:52.278925 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-m7xc9" podUID="8faad596-00ed-4982-9f42-2f1a2465098c"
Mar 20 07:16:52 crc kubenswrapper[4749]: I0320 07:16:52.337592 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5787f8cc8-vkt6l"]
Mar 20 07:16:52 crc kubenswrapper[4749]: W0320 07:16:52.400746 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cd57b81_ff84_4d13_902f_36d9368d7421.slice/crio-c02b684a0c72726b6a643acf5b4274e6e8353969331145b39f3c9f9419419258 WatchSource:0}: Error finding container c02b684a0c72726b6a643acf5b4274e6e8353969331145b39f3c9f9419419258: Status 404 returned error can't find the container with id c02b684a0c72726b6a643acf5b4274e6e8353969331145b39f3c9f9419419258
Mar 20 07:16:52 crc kubenswrapper[4749]: I0320 07:16:52.465627 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8d8fb768b-2zhpc"]
Mar 20 07:16:52 crc kubenswrapper[4749]: W0320 07:16:52.470203 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda24c70a4_ec51_459d_a464_65dd450c2366.slice/crio-3e9a49aa4604624cdf36f853a7a727a362cda12e3cc66f20fb1e94c8ea959095 WatchSource:0}: Error finding container 3e9a49aa4604624cdf36f853a7a727a362cda12e3cc66f20fb1e94c8ea959095: Status 404 returned error can't find the container with id 3e9a49aa4604624cdf36f853a7a727a362cda12e3cc66f20fb1e94c8ea959095
Mar 20 07:16:52 crc kubenswrapper[4749]: I0320 07:16:52.535558 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 07:16:52 crc kubenswrapper[4749]: E0320 07:16:52.710764 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Mar 20 07:16:52 crc kubenswrapper[4749]: E0320 07:16:52.711179 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-txjch,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-szw2w_openshift-marketplace(cd2462fa-d077-4466-8930-6f2e69938c1b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 20 07:16:52 crc kubenswrapper[4749]: E0320 07:16:52.712338 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-szw2w" podUID="cd2462fa-d077-4466-8930-6f2e69938c1b"
Mar 20 07:16:52 crc kubenswrapper[4749]: I0320 07:16:52.754847 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 07:16:52 crc kubenswrapper[4749]: I0320 07:16:52.764934 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 07:16:52 crc kubenswrapper[4749]: I0320 07:16:52.765046 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 20 07:16:52 crc kubenswrapper[4749]: I0320 07:16:52.784724 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 07:16:52 crc kubenswrapper[4749]: I0320 07:16:52.921777 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8713a7f7-3787-44c0-9bb4-d7ce813203a9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8713a7f7-3787-44c0-9bb4-d7ce813203a9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 07:16:52 crc kubenswrapper[4749]: I0320 07:16:52.921851 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8713a7f7-3787-44c0-9bb4-d7ce813203a9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8713a7f7-3787-44c0-9bb4-d7ce813203a9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 07:16:52 crc kubenswrapper[4749]: I0320 07:16:52.982512 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8d8fb768b-2zhpc" event={"ID":"a24c70a4-ec51-459d-a464-65dd450c2366","Type":"ContainerStarted","Data":"8d8776c7ee0b6c89af483156c57775e92b79920a72800eb0f3d1b5e91e32ffa3"} Mar 20 07:16:52 crc kubenswrapper[4749]: I0320 07:16:52.982564 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8d8fb768b-2zhpc" event={"ID":"a24c70a4-ec51-459d-a464-65dd450c2366","Type":"ContainerStarted","Data":"3e9a49aa4604624cdf36f853a7a727a362cda12e3cc66f20fb1e94c8ea959095"} Mar 20 07:16:52 crc kubenswrapper[4749]: I0320 07:16:52.983002 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-8d8fb768b-2zhpc" Mar 20 07:16:52 crc kubenswrapper[4749]: I0320 07:16:52.984976 4749 generic.go:334] "Generic (PLEG): container finished" podID="9c486dab-86dd-44dd-8c82-4c07ed84aa50" containerID="605181e8839a7bab36bf33578fe78f629ef89f82b77f709f1ed2a8398683a2d0" exitCode=0 Mar 20 07:16:52 crc kubenswrapper[4749]: I0320 07:16:52.985033 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l72fj" event={"ID":"9c486dab-86dd-44dd-8c82-4c07ed84aa50","Type":"ContainerDied","Data":"605181e8839a7bab36bf33578fe78f629ef89f82b77f709f1ed2a8398683a2d0"} Mar 20 07:16:52 crc kubenswrapper[4749]: I0320 07:16:52.987531 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-8d8fb768b-2zhpc" Mar 20 07:16:52 crc kubenswrapper[4749]: I0320 07:16:52.989232 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5787f8cc8-vkt6l" event={"ID":"0cd57b81-ff84-4d13-902f-36d9368d7421","Type":"ContainerStarted","Data":"333e034b210f8b92b8a6b5fc11c52b1959b9c87ce7e1492903632b9e3e27ff34"} Mar 20 07:16:52 crc kubenswrapper[4749]: I0320 07:16:52.989272 4749 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-route-controller-manager/route-controller-manager-5787f8cc8-vkt6l" event={"ID":"0cd57b81-ff84-4d13-902f-36d9368d7421","Type":"ContainerStarted","Data":"c02b684a0c72726b6a643acf5b4274e6e8353969331145b39f3c9f9419419258"} Mar 20 07:16:52 crc kubenswrapper[4749]: E0320 07:16:52.992228 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-m7xc9" podUID="8faad596-00ed-4982-9f42-2f1a2465098c" Mar 20 07:16:52 crc kubenswrapper[4749]: E0320 07:16:52.992336 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-szw2w" podUID="cd2462fa-d077-4466-8930-6f2e69938c1b" Mar 20 07:16:52 crc kubenswrapper[4749]: E0320 07:16:52.992860 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-9ss5d" podUID="a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8" Mar 20 07:16:52 crc kubenswrapper[4749]: E0320 07:16:52.993035 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-lq4rb" podUID="937dac41-5afa-495a-9909-1152a419549c" Mar 20 07:16:52 crc kubenswrapper[4749]: I0320 07:16:52.998548 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-8d8fb768b-2zhpc" podStartSLOduration=16.99853198 podStartE2EDuration="16.99853198s" podCreationTimestamp="2026-03-20 07:16:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:16:52.998170561 +0000 UTC m=+249.547828198" watchObservedRunningTime="2026-03-20 07:16:52.99853198 +0000 UTC m=+249.548189617" Mar 20 07:16:53 crc kubenswrapper[4749]: I0320 07:16:53.022893 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8713a7f7-3787-44c0-9bb4-d7ce813203a9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8713a7f7-3787-44c0-9bb4-d7ce813203a9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 07:16:53 crc kubenswrapper[4749]: I0320 07:16:53.022991 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8713a7f7-3787-44c0-9bb4-d7ce813203a9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8713a7f7-3787-44c0-9bb4-d7ce813203a9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 07:16:53 crc kubenswrapper[4749]: I0320 07:16:53.023091 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8713a7f7-3787-44c0-9bb4-d7ce813203a9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8713a7f7-3787-44c0-9bb4-d7ce813203a9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" 
Mar 20 07:16:53 crc kubenswrapper[4749]: I0320 07:16:53.029421 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5787f8cc8-vkt6l" podStartSLOduration=17.029403453 podStartE2EDuration="17.029403453s" podCreationTimestamp="2026-03-20 07:16:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:16:53.023570153 +0000 UTC m=+249.573227800" watchObservedRunningTime="2026-03-20 07:16:53.029403453 +0000 UTC m=+249.579061100"
Mar 20 07:16:53 crc kubenswrapper[4749]: I0320 07:16:53.043782 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8713a7f7-3787-44c0-9bb4-d7ce813203a9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8713a7f7-3787-44c0-9bb4-d7ce813203a9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 20 07:16:53 crc kubenswrapper[4749]: I0320 07:16:53.081082 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 20 07:16:53 crc kubenswrapper[4749]: I0320 07:16:53.554664 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Mar 20 07:16:53 crc kubenswrapper[4749]: W0320 07:16:53.567145 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8713a7f7_3787_44c0_9bb4_d7ce813203a9.slice/crio-9ec221e047b3eb08f5d2fa942b6426eeb09e2ceaccfdc312cfc68930a0ded10f WatchSource:0}: Error finding container 9ec221e047b3eb08f5d2fa942b6426eeb09e2ceaccfdc312cfc68930a0ded10f: Status 404 returned error can't find the container with id 9ec221e047b3eb08f5d2fa942b6426eeb09e2ceaccfdc312cfc68930a0ded10f
Mar 20 07:16:53 crc kubenswrapper[4749]: I0320 07:16:53.996725 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l72fj" event={"ID":"9c486dab-86dd-44dd-8c82-4c07ed84aa50","Type":"ContainerStarted","Data":"39fe9be0f912e80fe35c038568c54aa7f8ca9de79b1998dbdfc3281167af819e"}
Mar 20 07:16:53 crc kubenswrapper[4749]: I0320 07:16:53.999395 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"8713a7f7-3787-44c0-9bb4-d7ce813203a9","Type":"ContainerStarted","Data":"fb0729a9e35e565a33ae3c7214cbf4115bf6bb0174750f9f3de549745578b57c"}
Mar 20 07:16:53 crc kubenswrapper[4749]: I0320 07:16:53.999428 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"8713a7f7-3787-44c0-9bb4-d7ce813203a9","Type":"ContainerStarted","Data":"9ec221e047b3eb08f5d2fa942b6426eeb09e2ceaccfdc312cfc68930a0ded10f"}
Mar 20 07:16:53 crc kubenswrapper[4749]: I0320 07:16:53.999683 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5787f8cc8-vkt6l"
Mar 20 07:16:54 crc kubenswrapper[4749]: I0320 07:16:54.004170 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5787f8cc8-vkt6l"
Mar 20 07:16:54 crc kubenswrapper[4749]: I0320 07:16:54.036674 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-l72fj" podStartSLOduration=3.060753664 podStartE2EDuration="37.036658123s" podCreationTimestamp="2026-03-20 07:16:17 +0000 UTC" firstStartedPulling="2026-03-20 07:16:19.446975488 +0000 UTC m=+215.996633135" lastFinishedPulling="2026-03-20 07:16:53.422879947 +0000 UTC m=+249.972537594" observedRunningTime="2026-03-20 07:16:54.03534391 +0000 UTC m=+250.585001557" watchObservedRunningTime="2026-03-20 07:16:54.036658123 +0000 UTC m=+250.586315770"
Mar 20 07:16:54 crc kubenswrapper[4749]: I0320 07:16:54.198490 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.198468572 podStartE2EDuration="2.198468572s" podCreationTimestamp="2026-03-20 07:16:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:16:54.08398249 +0000 UTC m=+250.633640137" watchObservedRunningTime="2026-03-20 07:16:54.198468572 +0000 UTC m=+250.748126219"
Mar 20 07:16:55 crc kubenswrapper[4749]: I0320 07:16:55.004914 4749 generic.go:334] "Generic (PLEG): container finished" podID="8713a7f7-3787-44c0-9bb4-d7ce813203a9" containerID="fb0729a9e35e565a33ae3c7214cbf4115bf6bb0174750f9f3de549745578b57c" exitCode=0
Mar 20 07:16:55 crc kubenswrapper[4749]: I0320 07:16:55.005102 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"8713a7f7-3787-44c0-9bb4-d7ce813203a9","Type":"ContainerDied","Data":"fb0729a9e35e565a33ae3c7214cbf4115bf6bb0174750f9f3de549745578b57c"}
Mar 20 07:16:55 crc kubenswrapper[4749]: I0320 07:16:55.809175 4749 csr.go:261] certificate signing request csr-xrwjg is approved, waiting to be issued
Mar 20 07:16:55 crc kubenswrapper[4749]: I0320 07:16:55.816667 4749 csr.go:257] certificate signing request csr-xrwjg is issued
Mar 20 07:16:56 crc kubenswrapper[4749]: I0320 07:16:56.011076 4749 generic.go:334] "Generic (PLEG): container finished" podID="da95cd86-f90a-4d7f-a308-4124b22d8427" containerID="dc31afc519505de07be42767bdf62bf2b5ca3a71c120b605dc393154acc3985b" exitCode=0
Mar 20 07:16:56 crc kubenswrapper[4749]: I0320 07:16:56.011153 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566516-6stbk" event={"ID":"da95cd86-f90a-4d7f-a308-4124b22d8427","Type":"ContainerDied","Data":"dc31afc519505de07be42767bdf62bf2b5ca3a71c120b605dc393154acc3985b"}
Mar 20 07:16:56 crc kubenswrapper[4749]: I0320 07:16:56.252111 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 20 07:16:56 crc kubenswrapper[4749]: I0320 07:16:56.370250 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8713a7f7-3787-44c0-9bb4-d7ce813203a9-kubelet-dir\") pod \"8713a7f7-3787-44c0-9bb4-d7ce813203a9\" (UID: \"8713a7f7-3787-44c0-9bb4-d7ce813203a9\") "
Mar 20 07:16:56 crc kubenswrapper[4749]: I0320 07:16:56.370346 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8713a7f7-3787-44c0-9bb4-d7ce813203a9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8713a7f7-3787-44c0-9bb4-d7ce813203a9" (UID: "8713a7f7-3787-44c0-9bb4-d7ce813203a9"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 07:16:56 crc kubenswrapper[4749]: I0320 07:16:56.370366 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8713a7f7-3787-44c0-9bb4-d7ce813203a9-kube-api-access\") pod \"8713a7f7-3787-44c0-9bb4-d7ce813203a9\" (UID: \"8713a7f7-3787-44c0-9bb4-d7ce813203a9\") "
Mar 20 07:16:56 crc kubenswrapper[4749]: I0320 07:16:56.370599 4749 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8713a7f7-3787-44c0-9bb4-d7ce813203a9-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 20 07:16:56 crc kubenswrapper[4749]: I0320 07:16:56.375823 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8713a7f7-3787-44c0-9bb4-d7ce813203a9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8713a7f7-3787-44c0-9bb4-d7ce813203a9" (UID: "8713a7f7-3787-44c0-9bb4-d7ce813203a9"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 07:16:56 crc kubenswrapper[4749]: I0320 07:16:56.471772 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8713a7f7-3787-44c0-9bb4-d7ce813203a9-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 20 07:16:56 crc kubenswrapper[4749]: I0320 07:16:56.613871 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-8d8fb768b-2zhpc"]
Mar 20 07:16:56 crc kubenswrapper[4749]: I0320 07:16:56.614072 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-8d8fb768b-2zhpc" podUID="a24c70a4-ec51-459d-a464-65dd450c2366" containerName="controller-manager" containerID="cri-o://8d8776c7ee0b6c89af483156c57775e92b79920a72800eb0f3d1b5e91e32ffa3" gracePeriod=30
Mar 20 07:16:56 crc kubenswrapper[4749]: I0320 07:16:56.715606 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5787f8cc8-vkt6l"]
Mar 20 07:16:56 crc kubenswrapper[4749]: I0320 07:16:56.817813 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-09 12:30:06.214024405 +0000 UTC
Mar 20 07:16:56 crc kubenswrapper[4749]: I0320 07:16:56.817845 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6341h13m9.396182425s for next certificate rotation
Mar 20 07:16:57 crc kubenswrapper[4749]: I0320 07:16:57.019253 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"8713a7f7-3787-44c0-9bb4-d7ce813203a9","Type":"ContainerDied","Data":"9ec221e047b3eb08f5d2fa942b6426eeb09e2ceaccfdc312cfc68930a0ded10f"}
Mar 20 07:16:57 crc kubenswrapper[4749]: I0320 07:16:57.019336 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ec221e047b3eb08f5d2fa942b6426eeb09e2ceaccfdc312cfc68930a0ded10f"
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 07:16:57 crc kubenswrapper[4749]: I0320 07:16:57.021932 4749 generic.go:334] "Generic (PLEG): container finished" podID="a24c70a4-ec51-459d-a464-65dd450c2366" containerID="8d8776c7ee0b6c89af483156c57775e92b79920a72800eb0f3d1b5e91e32ffa3" exitCode=0 Mar 20 07:16:57 crc kubenswrapper[4749]: I0320 07:16:57.021967 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8d8fb768b-2zhpc" event={"ID":"a24c70a4-ec51-459d-a464-65dd450c2366","Type":"ContainerDied","Data":"8d8776c7ee0b6c89af483156c57775e92b79920a72800eb0f3d1b5e91e32ffa3"} Mar 20 07:16:57 crc kubenswrapper[4749]: I0320 07:16:57.022006 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8d8fb768b-2zhpc" event={"ID":"a24c70a4-ec51-459d-a464-65dd450c2366","Type":"ContainerDied","Data":"3e9a49aa4604624cdf36f853a7a727a362cda12e3cc66f20fb1e94c8ea959095"} Mar 20 07:16:57 crc kubenswrapper[4749]: I0320 07:16:57.022027 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e9a49aa4604624cdf36f853a7a727a362cda12e3cc66f20fb1e94c8ea959095" Mar 20 07:16:57 crc kubenswrapper[4749]: I0320 07:16:57.022114 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5787f8cc8-vkt6l" podUID="0cd57b81-ff84-4d13-902f-36d9368d7421" containerName="route-controller-manager" containerID="cri-o://333e034b210f8b92b8a6b5fc11c52b1959b9c87ce7e1492903632b9e3e27ff34" gracePeriod=30 Mar 20 07:16:57 crc kubenswrapper[4749]: I0320 07:16:57.086009 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8d8fb768b-2zhpc" Mar 20 07:16:57 crc kubenswrapper[4749]: I0320 07:16:57.180072 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a24c70a4-ec51-459d-a464-65dd450c2366-config\") pod \"a24c70a4-ec51-459d-a464-65dd450c2366\" (UID: \"a24c70a4-ec51-459d-a464-65dd450c2366\") " Mar 20 07:16:57 crc kubenswrapper[4749]: I0320 07:16:57.180180 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a24c70a4-ec51-459d-a464-65dd450c2366-serving-cert\") pod \"a24c70a4-ec51-459d-a464-65dd450c2366\" (UID: \"a24c70a4-ec51-459d-a464-65dd450c2366\") " Mar 20 07:16:57 crc kubenswrapper[4749]: I0320 07:16:57.180217 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a24c70a4-ec51-459d-a464-65dd450c2366-client-ca\") pod \"a24c70a4-ec51-459d-a464-65dd450c2366\" (UID: \"a24c70a4-ec51-459d-a464-65dd450c2366\") " Mar 20 07:16:57 crc kubenswrapper[4749]: I0320 07:16:57.180250 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v66qh\" (UniqueName: \"kubernetes.io/projected/a24c70a4-ec51-459d-a464-65dd450c2366-kube-api-access-v66qh\") pod \"a24c70a4-ec51-459d-a464-65dd450c2366\" (UID: \"a24c70a4-ec51-459d-a464-65dd450c2366\") " Mar 20 07:16:57 crc kubenswrapper[4749]: I0320 07:16:57.180440 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a24c70a4-ec51-459d-a464-65dd450c2366-proxy-ca-bundles\") pod \"a24c70a4-ec51-459d-a464-65dd450c2366\" 
(UID: \"a24c70a4-ec51-459d-a464-65dd450c2366\") " Mar 20 07:16:57 crc kubenswrapper[4749]: I0320 07:16:57.181040 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a24c70a4-ec51-459d-a464-65dd450c2366-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a24c70a4-ec51-459d-a464-65dd450c2366" (UID: "a24c70a4-ec51-459d-a464-65dd450c2366"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:16:57 crc kubenswrapper[4749]: I0320 07:16:57.181055 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a24c70a4-ec51-459d-a464-65dd450c2366-client-ca" (OuterVolumeSpecName: "client-ca") pod "a24c70a4-ec51-459d-a464-65dd450c2366" (UID: "a24c70a4-ec51-459d-a464-65dd450c2366"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:16:57 crc kubenswrapper[4749]: I0320 07:16:57.181089 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a24c70a4-ec51-459d-a464-65dd450c2366-config" (OuterVolumeSpecName: "config") pod "a24c70a4-ec51-459d-a464-65dd450c2366" (UID: "a24c70a4-ec51-459d-a464-65dd450c2366"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:16:57 crc kubenswrapper[4749]: I0320 07:16:57.186990 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a24c70a4-ec51-459d-a464-65dd450c2366-kube-api-access-v66qh" (OuterVolumeSpecName: "kube-api-access-v66qh") pod "a24c70a4-ec51-459d-a464-65dd450c2366" (UID: "a24c70a4-ec51-459d-a464-65dd450c2366"). InnerVolumeSpecName "kube-api-access-v66qh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:16:57 crc kubenswrapper[4749]: I0320 07:16:57.187513 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a24c70a4-ec51-459d-a464-65dd450c2366-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a24c70a4-ec51-459d-a464-65dd450c2366" (UID: "a24c70a4-ec51-459d-a464-65dd450c2366"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:16:57 crc kubenswrapper[4749]: I0320 07:16:57.263216 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566516-6stbk" Mar 20 07:16:57 crc kubenswrapper[4749]: I0320 07:16:57.289406 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a24c70a4-ec51-459d-a464-65dd450c2366-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 07:16:57 crc kubenswrapper[4749]: I0320 07:16:57.289450 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a24c70a4-ec51-459d-a464-65dd450c2366-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:16:57 crc kubenswrapper[4749]: I0320 07:16:57.289465 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a24c70a4-ec51-459d-a464-65dd450c2366-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 07:16:57 crc kubenswrapper[4749]: I0320 07:16:57.289477 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a24c70a4-ec51-459d-a464-65dd450c2366-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 07:16:57 crc kubenswrapper[4749]: I0320 07:16:57.289496 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v66qh\" (UniqueName: \"kubernetes.io/projected/a24c70a4-ec51-459d-a464-65dd450c2366-kube-api-access-v66qh\") on node \"crc\" DevicePath \"\"" Mar 20 07:16:57 crc kubenswrapper[4749]: I0320 07:16:57.390760 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7xkd\" (UniqueName: \"kubernetes.io/projected/da95cd86-f90a-4d7f-a308-4124b22d8427-kube-api-access-r7xkd\") pod \"da95cd86-f90a-4d7f-a308-4124b22d8427\" (UID: \"da95cd86-f90a-4d7f-a308-4124b22d8427\") " Mar 20 07:16:57 crc kubenswrapper[4749]: I0320 07:16:57.394782 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da95cd86-f90a-4d7f-a308-4124b22d8427-kube-api-access-r7xkd" (OuterVolumeSpecName: "kube-api-access-r7xkd") pod "da95cd86-f90a-4d7f-a308-4124b22d8427" (UID: "da95cd86-f90a-4d7f-a308-4124b22d8427"). InnerVolumeSpecName "kube-api-access-r7xkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:16:57 crc kubenswrapper[4749]: I0320 07:16:57.492923 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7xkd\" (UniqueName: \"kubernetes.io/projected/da95cd86-f90a-4d7f-a308-4124b22d8427-kube-api-access-r7xkd\") on node \"crc\" DevicePath \"\"" Mar 20 07:16:57 crc kubenswrapper[4749]: I0320 07:16:57.818652 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-17 00:34:20.891404045 +0000 UTC Mar 20 07:16:57 crc kubenswrapper[4749]: I0320 07:16:57.818688 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7265h17m23.072718729s for next certificate rotation Mar 20 07:16:57 crc kubenswrapper[4749]: I0320 07:16:57.840948 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5787f8cc8-vkt6l" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.001944 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cd57b81-ff84-4d13-902f-36d9368d7421-config\") pod \"0cd57b81-ff84-4d13-902f-36d9368d7421\" (UID: \"0cd57b81-ff84-4d13-902f-36d9368d7421\") " Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.002036 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0cd57b81-ff84-4d13-902f-36d9368d7421-serving-cert\") pod \"0cd57b81-ff84-4d13-902f-36d9368d7421\" (UID: \"0cd57b81-ff84-4d13-902f-36d9368d7421\") " Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.002159 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j45cf\" (UniqueName: \"kubernetes.io/projected/0cd57b81-ff84-4d13-902f-36d9368d7421-kube-api-access-j45cf\") pod \"0cd57b81-ff84-4d13-902f-36d9368d7421\" (UID: \"0cd57b81-ff84-4d13-902f-36d9368d7421\") " Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.002200 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0cd57b81-ff84-4d13-902f-36d9368d7421-client-ca\") pod \"0cd57b81-ff84-4d13-902f-36d9368d7421\" (UID: \"0cd57b81-ff84-4d13-902f-36d9368d7421\") " Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.002971 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cd57b81-ff84-4d13-902f-36d9368d7421-config" (OuterVolumeSpecName: "config") pod "0cd57b81-ff84-4d13-902f-36d9368d7421" (UID: "0cd57b81-ff84-4d13-902f-36d9368d7421"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.003512 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cd57b81-ff84-4d13-902f-36d9368d7421-client-ca" (OuterVolumeSpecName: "client-ca") pod "0cd57b81-ff84-4d13-902f-36d9368d7421" (UID: "0cd57b81-ff84-4d13-902f-36d9368d7421"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.005778 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cd57b81-ff84-4d13-902f-36d9368d7421-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0cd57b81-ff84-4d13-902f-36d9368d7421" (UID: "0cd57b81-ff84-4d13-902f-36d9368d7421"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.006188 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cd57b81-ff84-4d13-902f-36d9368d7421-kube-api-access-j45cf" (OuterVolumeSpecName: "kube-api-access-j45cf") pod "0cd57b81-ff84-4d13-902f-36d9368d7421" (UID: "0cd57b81-ff84-4d13-902f-36d9368d7421"). InnerVolumeSpecName "kube-api-access-j45cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.028786 4749 generic.go:334] "Generic (PLEG): container finished" podID="0cd57b81-ff84-4d13-902f-36d9368d7421" containerID="333e034b210f8b92b8a6b5fc11c52b1959b9c87ce7e1492903632b9e3e27ff34" exitCode=0 Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.028858 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5787f8cc8-vkt6l" event={"ID":"0cd57b81-ff84-4d13-902f-36d9368d7421","Type":"ContainerDied","Data":"333e034b210f8b92b8a6b5fc11c52b1959b9c87ce7e1492903632b9e3e27ff34"} Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.028891 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5787f8cc8-vkt6l" event={"ID":"0cd57b81-ff84-4d13-902f-36d9368d7421","Type":"ContainerDied","Data":"c02b684a0c72726b6a643acf5b4274e6e8353969331145b39f3c9f9419419258"} Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.028914 4749 scope.go:117] "RemoveContainer" containerID="333e034b210f8b92b8a6b5fc11c52b1959b9c87ce7e1492903632b9e3e27ff34" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.029011 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5787f8cc8-vkt6l" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.032753 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566516-6stbk" event={"ID":"da95cd86-f90a-4d7f-a308-4124b22d8427","Type":"ContainerDied","Data":"74944d5ca2ebb3420955970757f2337dd3fbfccd0cffb9aa2b4076c3cffc8f05"} Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.032793 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566516-6stbk" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.032803 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74944d5ca2ebb3420955970757f2337dd3fbfccd0cffb9aa2b4076c3cffc8f05" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.032781 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-8d8fb768b-2zhpc" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.045612 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7c9d8ddb97-k2lf7"] Mar 20 07:16:58 crc kubenswrapper[4749]: E0320 07:16:58.046024 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a24c70a4-ec51-459d-a464-65dd450c2366" containerName="controller-manager" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.046041 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a24c70a4-ec51-459d-a464-65dd450c2366" containerName="controller-manager" Mar 20 07:16:58 crc kubenswrapper[4749]: E0320 07:16:58.046076 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da95cd86-f90a-4d7f-a308-4124b22d8427" containerName="oc" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.046085 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="da95cd86-f90a-4d7f-a308-4124b22d8427" containerName="oc" Mar 20 07:16:58 crc kubenswrapper[4749]: E0320 07:16:58.046101 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8713a7f7-3787-44c0-9bb4-d7ce813203a9" containerName="pruner" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.046108 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8713a7f7-3787-44c0-9bb4-d7ce813203a9" containerName="pruner" Mar 20 07:16:58 crc kubenswrapper[4749]: E0320 07:16:58.046157 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cd57b81-ff84-4d13-902f-36d9368d7421" containerName="route-controller-manager" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.046169 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cd57b81-ff84-4d13-902f-36d9368d7421" containerName="route-controller-manager" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.046362 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a24c70a4-ec51-459d-a464-65dd450c2366" containerName="controller-manager" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.046398 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="8713a7f7-3787-44c0-9bb4-d7ce813203a9" containerName="pruner" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.046408 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="da95cd86-f90a-4d7f-a308-4124b22d8427" containerName="oc" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.046420 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cd57b81-ff84-4d13-902f-36d9368d7421" containerName="route-controller-manager" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.046962 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7c9d8ddb97-k2lf7" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.048812 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.055521 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.056872 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.057158 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.057369 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.057447 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.058992 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b77d896c5-9fgdb"] Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.059856 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b77d896c5-9fgdb" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.064180 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7c9d8ddb97-k2lf7"] Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.064585 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.064682 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.064733 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.066368 4749 scope.go:117] "RemoveContainer" containerID="333e034b210f8b92b8a6b5fc11c52b1959b9c87ce7e1492903632b9e3e27ff34" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.068305 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.068340 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.071622 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.072831 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b77d896c5-9fgdb"] Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.073192 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 07:16:58 crc 
kubenswrapper[4749]: I0320 07:16:58.093201 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-l72fj" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.093241 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-l72fj" Mar 20 07:16:58 crc kubenswrapper[4749]: E0320 07:16:58.095816 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"333e034b210f8b92b8a6b5fc11c52b1959b9c87ce7e1492903632b9e3e27ff34\": container with ID starting with 333e034b210f8b92b8a6b5fc11c52b1959b9c87ce7e1492903632b9e3e27ff34 not found: ID does not exist" containerID="333e034b210f8b92b8a6b5fc11c52b1959b9c87ce7e1492903632b9e3e27ff34" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.095972 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"333e034b210f8b92b8a6b5fc11c52b1959b9c87ce7e1492903632b9e3e27ff34"} err="failed to get container status \"333e034b210f8b92b8a6b5fc11c52b1959b9c87ce7e1492903632b9e3e27ff34\": rpc error: code = NotFound desc = could not find container \"333e034b210f8b92b8a6b5fc11c52b1959b9c87ce7e1492903632b9e3e27ff34\": container with ID starting with 333e034b210f8b92b8a6b5fc11c52b1959b9c87ce7e1492903632b9e3e27ff34 not found: ID does not exist" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.103779 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j45cf\" (UniqueName: \"kubernetes.io/projected/0cd57b81-ff84-4d13-902f-36d9368d7421-kube-api-access-j45cf\") on node \"crc\" DevicePath \"\"" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.103814 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0cd57b81-ff84-4d13-902f-36d9368d7421-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.103829 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cd57b81-ff84-4d13-902f-36d9368d7421-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.103839 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0cd57b81-ff84-4d13-902f-36d9368d7421-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.150578 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.151444 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.157271 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.157467 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.158884 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-8d8fb768b-2zhpc"] Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.163694 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-8d8fb768b-2zhpc"] Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.171712 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.177110 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5787f8cc8-vkt6l"] Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.177161 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5787f8cc8-vkt6l"] Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.187193 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cd57b81-ff84-4d13-902f-36d9368d7421" path="/var/lib/kubelet/pods/0cd57b81-ff84-4d13-902f-36d9368d7421/volumes" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.188498 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a24c70a4-ec51-459d-a464-65dd450c2366" path="/var/lib/kubelet/pods/a24c70a4-ec51-459d-a464-65dd450c2366/volumes" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.204446 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2871df9-37c9-45aa-a6f9-af85d64f7615-serving-cert\") pod \"controller-manager-7c9d8ddb97-k2lf7\" (UID: \"c2871df9-37c9-45aa-a6f9-af85d64f7615\") " pod="openshift-controller-manager/controller-manager-7c9d8ddb97-k2lf7" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.204499 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2871df9-37c9-45aa-a6f9-af85d64f7615-config\") pod \"controller-manager-7c9d8ddb97-k2lf7\" (UID: \"c2871df9-37c9-45aa-a6f9-af85d64f7615\") " pod="openshift-controller-manager/controller-manager-7c9d8ddb97-k2lf7" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.204526 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2871df9-37c9-45aa-a6f9-af85d64f7615-client-ca\") pod \"controller-manager-7c9d8ddb97-k2lf7\" (UID: \"c2871df9-37c9-45aa-a6f9-af85d64f7615\") " pod="openshift-controller-manager/controller-manager-7c9d8ddb97-k2lf7" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.204611 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rnbq\" (UniqueName: \"kubernetes.io/projected/6551fdd2-b4de-4478-aa69-c1c2bb35d1b7-kube-api-access-4rnbq\") pod \"route-controller-manager-6b77d896c5-9fgdb\" (UID: 
\"6551fdd2-b4de-4478-aa69-c1c2bb35d1b7\") " pod="openshift-route-controller-manager/route-controller-manager-6b77d896c5-9fgdb" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.204678 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c2871df9-37c9-45aa-a6f9-af85d64f7615-proxy-ca-bundles\") pod \"controller-manager-7c9d8ddb97-k2lf7\" (UID: \"c2871df9-37c9-45aa-a6f9-af85d64f7615\") " pod="openshift-controller-manager/controller-manager-7c9d8ddb97-k2lf7" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.204699 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnspv\" (UniqueName: \"kubernetes.io/projected/c2871df9-37c9-45aa-a6f9-af85d64f7615-kube-api-access-vnspv\") pod \"controller-manager-7c9d8ddb97-k2lf7\" (UID: \"c2871df9-37c9-45aa-a6f9-af85d64f7615\") " pod="openshift-controller-manager/controller-manager-7c9d8ddb97-k2lf7" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.204729 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6551fdd2-b4de-4478-aa69-c1c2bb35d1b7-client-ca\") pod \"route-controller-manager-6b77d896c5-9fgdb\" (UID: \"6551fdd2-b4de-4478-aa69-c1c2bb35d1b7\") " pod="openshift-route-controller-manager/route-controller-manager-6b77d896c5-9fgdb" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.204746 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6551fdd2-b4de-4478-aa69-c1c2bb35d1b7-serving-cert\") pod \"route-controller-manager-6b77d896c5-9fgdb\" (UID: \"6551fdd2-b4de-4478-aa69-c1c2bb35d1b7\") " pod="openshift-route-controller-manager/route-controller-manager-6b77d896c5-9fgdb" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.204795 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6551fdd2-b4de-4478-aa69-c1c2bb35d1b7-config\") pod \"route-controller-manager-6b77d896c5-9fgdb\" (UID: \"6551fdd2-b4de-4478-aa69-c1c2bb35d1b7\") " pod="openshift-route-controller-manager/route-controller-manager-6b77d896c5-9fgdb" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.306183 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/30422da4-1696-4beb-be35-4216a61c897c-kubelet-dir\") pod \"installer-9-crc\" (UID: \"30422da4-1696-4beb-be35-4216a61c897c\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.306236 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2871df9-37c9-45aa-a6f9-af85d64f7615-client-ca\") pod \"controller-manager-7c9d8ddb97-k2lf7\" (UID: \"c2871df9-37c9-45aa-a6f9-af85d64f7615\") " pod="openshift-controller-manager/controller-manager-7c9d8ddb97-k2lf7" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.307108 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2871df9-37c9-45aa-a6f9-af85d64f7615-client-ca\") pod \"controller-manager-7c9d8ddb97-k2lf7\" (UID: \"c2871df9-37c9-45aa-a6f9-af85d64f7615\") " 
pod="openshift-controller-manager/controller-manager-7c9d8ddb97-k2lf7" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.306267 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rnbq\" (UniqueName: \"kubernetes.io/projected/6551fdd2-b4de-4478-aa69-c1c2bb35d1b7-kube-api-access-4rnbq\") pod \"route-controller-manager-6b77d896c5-9fgdb\" (UID: \"6551fdd2-b4de-4478-aa69-c1c2bb35d1b7\") " pod="openshift-route-controller-manager/route-controller-manager-6b77d896c5-9fgdb" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.307200 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c2871df9-37c9-45aa-a6f9-af85d64f7615-proxy-ca-bundles\") pod \"controller-manager-7c9d8ddb97-k2lf7\" (UID: \"c2871df9-37c9-45aa-a6f9-af85d64f7615\") " pod="openshift-controller-manager/controller-manager-7c9d8ddb97-k2lf7" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.307220 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnspv\" (UniqueName: \"kubernetes.io/projected/c2871df9-37c9-45aa-a6f9-af85d64f7615-kube-api-access-vnspv\") pod \"controller-manager-7c9d8ddb97-k2lf7\" (UID: \"c2871df9-37c9-45aa-a6f9-af85d64f7615\") " pod="openshift-controller-manager/controller-manager-7c9d8ddb97-k2lf7" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.308065 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6551fdd2-b4de-4478-aa69-c1c2bb35d1b7-client-ca\") pod \"route-controller-manager-6b77d896c5-9fgdb\" (UID: \"6551fdd2-b4de-4478-aa69-c1c2bb35d1b7\") " pod="openshift-route-controller-manager/route-controller-manager-6b77d896c5-9fgdb" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.308358 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c2871df9-37c9-45aa-a6f9-af85d64f7615-proxy-ca-bundles\") pod \"controller-manager-7c9d8ddb97-k2lf7\" (UID: \"c2871df9-37c9-45aa-a6f9-af85d64f7615\") " pod="openshift-controller-manager/controller-manager-7c9d8ddb97-k2lf7" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.307236 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6551fdd2-b4de-4478-aa69-c1c2bb35d1b7-client-ca\") pod \"route-controller-manager-6b77d896c5-9fgdb\" (UID: \"6551fdd2-b4de-4478-aa69-c1c2bb35d1b7\") " pod="openshift-route-controller-manager/route-controller-manager-6b77d896c5-9fgdb" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.308419 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6551fdd2-b4de-4478-aa69-c1c2bb35d1b7-serving-cert\") pod \"route-controller-manager-6b77d896c5-9fgdb\" (UID: \"6551fdd2-b4de-4478-aa69-c1c2bb35d1b7\") " pod="openshift-route-controller-manager/route-controller-manager-6b77d896c5-9fgdb" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.308437 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6551fdd2-b4de-4478-aa69-c1c2bb35d1b7-config\") pod \"route-controller-manager-6b77d896c5-9fgdb\" (UID: \"6551fdd2-b4de-4478-aa69-c1c2bb35d1b7\") " pod="openshift-route-controller-manager/route-controller-manager-6b77d896c5-9fgdb" Mar 20 07:16:58 crc kubenswrapper[4749]: 
I0320 07:16:58.308897 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/30422da4-1696-4beb-be35-4216a61c897c-var-lock\") pod \"installer-9-crc\" (UID: \"30422da4-1696-4beb-be35-4216a61c897c\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.308930 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2871df9-37c9-45aa-a6f9-af85d64f7615-serving-cert\") pod \"controller-manager-7c9d8ddb97-k2lf7\" (UID: \"c2871df9-37c9-45aa-a6f9-af85d64f7615\") " pod="openshift-controller-manager/controller-manager-7c9d8ddb97-k2lf7" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.308969 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/30422da4-1696-4beb-be35-4216a61c897c-kube-api-access\") pod \"installer-9-crc\" (UID: \"30422da4-1696-4beb-be35-4216a61c897c\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.308990 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2871df9-37c9-45aa-a6f9-af85d64f7615-config\") pod \"controller-manager-7c9d8ddb97-k2lf7\" (UID: \"c2871df9-37c9-45aa-a6f9-af85d64f7615\") " pod="openshift-controller-manager/controller-manager-7c9d8ddb97-k2lf7" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.309973 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2871df9-37c9-45aa-a6f9-af85d64f7615-config\") pod \"controller-manager-7c9d8ddb97-k2lf7\" (UID: \"c2871df9-37c9-45aa-a6f9-af85d64f7615\") " pod="openshift-controller-manager/controller-manager-7c9d8ddb97-k2lf7" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.311083 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6551fdd2-b4de-4478-aa69-c1c2bb35d1b7-serving-cert\") pod \"route-controller-manager-6b77d896c5-9fgdb\" (UID: \"6551fdd2-b4de-4478-aa69-c1c2bb35d1b7\") " pod="openshift-route-controller-manager/route-controller-manager-6b77d896c5-9fgdb" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.311646 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2871df9-37c9-45aa-a6f9-af85d64f7615-serving-cert\") pod \"controller-manager-7c9d8ddb97-k2lf7\" (UID: \"c2871df9-37c9-45aa-a6f9-af85d64f7615\") " pod="openshift-controller-manager/controller-manager-7c9d8ddb97-k2lf7" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.313782 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6551fdd2-b4de-4478-aa69-c1c2bb35d1b7-config\") pod \"route-controller-manager-6b77d896c5-9fgdb\" (UID: \"6551fdd2-b4de-4478-aa69-c1c2bb35d1b7\") " pod="openshift-route-controller-manager/route-controller-manager-6b77d896c5-9fgdb" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.326614 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rnbq\" (UniqueName: \"kubernetes.io/projected/6551fdd2-b4de-4478-aa69-c1c2bb35d1b7-kube-api-access-4rnbq\") pod \"route-controller-manager-6b77d896c5-9fgdb\" (UID: 
\"6551fdd2-b4de-4478-aa69-c1c2bb35d1b7\") " pod="openshift-route-controller-manager/route-controller-manager-6b77d896c5-9fgdb" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.334650 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnspv\" (UniqueName: \"kubernetes.io/projected/c2871df9-37c9-45aa-a6f9-af85d64f7615-kube-api-access-vnspv\") pod \"controller-manager-7c9d8ddb97-k2lf7\" (UID: \"c2871df9-37c9-45aa-a6f9-af85d64f7615\") " pod="openshift-controller-manager/controller-manager-7c9d8ddb97-k2lf7" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.410247 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/30422da4-1696-4beb-be35-4216a61c897c-kube-api-access\") pod \"installer-9-crc\" (UID: \"30422da4-1696-4beb-be35-4216a61c897c\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.410611 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/30422da4-1696-4beb-be35-4216a61c897c-kubelet-dir\") pod \"installer-9-crc\" (UID: \"30422da4-1696-4beb-be35-4216a61c897c\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.410686 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/30422da4-1696-4beb-be35-4216a61c897c-var-lock\") pod \"installer-9-crc\" (UID: \"30422da4-1696-4beb-be35-4216a61c897c\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.410753 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/30422da4-1696-4beb-be35-4216a61c897c-var-lock\") pod \"installer-9-crc\" (UID: \"30422da4-1696-4beb-be35-4216a61c897c\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.411023 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/30422da4-1696-4beb-be35-4216a61c897c-kubelet-dir\") pod \"installer-9-crc\" (UID: \"30422da4-1696-4beb-be35-4216a61c897c\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.415405 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-l72fj" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.425829 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/30422da4-1696-4beb-be35-4216a61c897c-kube-api-access\") pod \"installer-9-crc\" (UID: \"30422da4-1696-4beb-be35-4216a61c897c\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.431633 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c9d8ddb97-k2lf7" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.448342 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b77d896c5-9fgdb" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.467109 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.658545 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7c9d8ddb97-k2lf7"] Mar 20 07:16:58 crc kubenswrapper[4749]: W0320 07:16:58.669542 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2871df9_37c9_45aa_a6f9_af85d64f7615.slice/crio-cbc5490bad36590e4af6497c53d4343fdadb78835df1d22661123308a537edf7 WatchSource:0}: Error finding container cbc5490bad36590e4af6497c53d4343fdadb78835df1d22661123308a537edf7: Status 404 returned error can't find the container with id cbc5490bad36590e4af6497c53d4343fdadb78835df1d22661123308a537edf7 Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.688213 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b77d896c5-9fgdb"] Mar 20 07:16:58 crc kubenswrapper[4749]: W0320 07:16:58.693040 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6551fdd2_b4de_4478_aa69_c1c2bb35d1b7.slice/crio-67dad6306c37b4a373584088fcd873356cd5bbee2fdf9af527223aa7cd2687cc WatchSource:0}: Error finding container 67dad6306c37b4a373584088fcd873356cd5bbee2fdf9af527223aa7cd2687cc: Status 404 returned error can't find the container with id 67dad6306c37b4a373584088fcd873356cd5bbee2fdf9af527223aa7cd2687cc Mar 20 07:16:58 crc kubenswrapper[4749]: I0320 07:16:58.717275 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 07:16:58 crc kubenswrapper[4749]: W0320 07:16:58.746839 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod30422da4_1696_4beb_be35_4216a61c897c.slice/crio-66e94799422370bc83f0e8c9d0235ba6dedd9dd7b626f993c57d3a92b8825eda WatchSource:0}: Error finding container 66e94799422370bc83f0e8c9d0235ba6dedd9dd7b626f993c57d3a92b8825eda: Status 404 returned error can't find the container with id 66e94799422370bc83f0e8c9d0235ba6dedd9dd7b626f993c57d3a92b8825eda Mar 20 07:16:59 crc kubenswrapper[4749]: I0320 07:16:59.041216 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c9d8ddb97-k2lf7" event={"ID":"c2871df9-37c9-45aa-a6f9-af85d64f7615","Type":"ContainerStarted","Data":"eecc02eecdca0b76a1879506f3e3f77c9f964dbc0fc414c78f34eb3098318722"} Mar 20 07:16:59 crc kubenswrapper[4749]: I0320 07:16:59.041262 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c9d8ddb97-k2lf7" event={"ID":"c2871df9-37c9-45aa-a6f9-af85d64f7615","Type":"ContainerStarted","Data":"cbc5490bad36590e4af6497c53d4343fdadb78835df1d22661123308a537edf7"} Mar 20 07:16:59 crc kubenswrapper[4749]: I0320 07:16:59.045587 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"30422da4-1696-4beb-be35-4216a61c897c","Type":"ContainerStarted","Data":"66e94799422370bc83f0e8c9d0235ba6dedd9dd7b626f993c57d3a92b8825eda"} Mar 20 07:16:59 crc kubenswrapper[4749]: I0320 07:16:59.047333 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b77d896c5-9fgdb" 
event={"ID":"6551fdd2-b4de-4478-aa69-c1c2bb35d1b7","Type":"ContainerStarted","Data":"96da0f68eb00a3a5cc59f119b7f6dff435df8cc1ef1ec0e0d73cac4bbb0ad58c"} Mar 20 07:16:59 crc kubenswrapper[4749]: I0320 07:16:59.047366 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b77d896c5-9fgdb" event={"ID":"6551fdd2-b4de-4478-aa69-c1c2bb35d1b7","Type":"ContainerStarted","Data":"67dad6306c37b4a373584088fcd873356cd5bbee2fdf9af527223aa7cd2687cc"} Mar 20 07:16:59 crc kubenswrapper[4749]: I0320 07:16:59.088482 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-l72fj" Mar 20 07:17:00 crc kubenswrapper[4749]: I0320 07:17:00.053814 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"30422da4-1696-4beb-be35-4216a61c897c","Type":"ContainerStarted","Data":"9a2ec5fd34ad5eac8d462ddc0672990a005d66e074fa6733c2b67116503f0fb6"} Mar 20 07:17:00 crc kubenswrapper[4749]: I0320 07:17:00.055000 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6b77d896c5-9fgdb" Mar 20 07:17:00 crc kubenswrapper[4749]: I0320 07:17:00.060419 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6b77d896c5-9fgdb" Mar 20 07:17:00 crc kubenswrapper[4749]: I0320 07:17:00.071887 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6b77d896c5-9fgdb" podStartSLOduration=4.071870371 podStartE2EDuration="4.071870371s" podCreationTimestamp="2026-03-20 07:16:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:17:00.067606612 +0000 UTC m=+256.617264259" watchObservedRunningTime="2026-03-20 07:17:00.071870371 +0000 UTC m=+256.621528018" Mar 20 07:17:00 crc kubenswrapper[4749]: I0320 07:17:00.100481 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.100461846 podStartE2EDuration="2.100461846s" podCreationTimestamp="2026-03-20 07:16:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:17:00.099047949 +0000 UTC m=+256.648705596" watchObservedRunningTime="2026-03-20 07:17:00.100461846 +0000 UTC m=+256.650119493" Mar 20 07:17:00 crc kubenswrapper[4749]: I0320 07:17:00.121322 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7c9d8ddb97-k2lf7" podStartSLOduration=4.121301921 podStartE2EDuration="4.121301921s" podCreationTimestamp="2026-03-20 07:16:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:17:00.118408217 +0000 UTC m=+256.668065864" watchObservedRunningTime="2026-03-20 07:17:00.121301921 +0000 UTC m=+256.670959568" Mar 20 07:17:03 crc kubenswrapper[4749]: I0320 07:17:03.070096 4749 generic.go:334] "Generic (PLEG): container finished" podID="a22f47dc-59ce-4cce-821c-508fc14a9508" containerID="4b3e98598ca0753493c27401b8ab35e23d717b69b5e00e34418ea0ffae39cd7f" exitCode=0 Mar 20 07:17:03 crc kubenswrapper[4749]: I0320 07:17:03.070463 4749 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x9swj" event={"ID":"a22f47dc-59ce-4cce-821c-508fc14a9508","Type":"ContainerDied","Data":"4b3e98598ca0753493c27401b8ab35e23d717b69b5e00e34418ea0ffae39cd7f"} Mar 20 07:17:03 crc kubenswrapper[4749]: I0320 07:17:03.074609 4749 generic.go:334] "Generic (PLEG): container finished" podID="b7e5d15e-f3f5-4595-be01-ae4f196285ad" containerID="cec62a01f58dc1bb798827b623adb1ec9658643bbf726980bae60126c0e39af0" exitCode=0 Mar 20 07:17:03 crc kubenswrapper[4749]: I0320 07:17:03.074656 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rv5h9" event={"ID":"b7e5d15e-f3f5-4595-be01-ae4f196285ad","Type":"ContainerDied","Data":"cec62a01f58dc1bb798827b623adb1ec9658643bbf726980bae60126c0e39af0"} Mar 20 07:17:04 crc kubenswrapper[4749]: I0320 07:17:04.081714 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rv5h9" event={"ID":"b7e5d15e-f3f5-4595-be01-ae4f196285ad","Type":"ContainerStarted","Data":"5b2744784a89060251e31f3410f211c4fdd6a528ed1e0fd2cb6969196eadcefb"} Mar 20 07:17:04 crc kubenswrapper[4749]: I0320 07:17:04.083868 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bbjcq" event={"ID":"ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0","Type":"ContainerStarted","Data":"2fa089fdfd234e5fbbd1f1a90f13d8163316131f2e222fbc6b2f37e8e477ace2"} Mar 20 07:17:04 crc kubenswrapper[4749]: I0320 07:17:04.086873 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x9swj" event={"ID":"a22f47dc-59ce-4cce-821c-508fc14a9508","Type":"ContainerStarted","Data":"bbc52864abe0fe9cf380b57e4493cf00807794a3926cfa0edc36578936ac2d49"} Mar 20 07:17:04 crc kubenswrapper[4749]: I0320 07:17:04.102086 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rv5h9" podStartSLOduration=2.790467259 podStartE2EDuration="49.102069242s" podCreationTimestamp="2026-03-20 07:16:15 +0000 UTC" firstStartedPulling="2026-03-20 07:16:17.307529816 +0000 UTC m=+213.857187463" lastFinishedPulling="2026-03-20 07:17:03.619131799 +0000 UTC m=+260.168789446" observedRunningTime="2026-03-20 07:17:04.100711807 +0000 UTC m=+260.650369454" watchObservedRunningTime="2026-03-20 07:17:04.102069242 +0000 UTC m=+260.651726889" Mar 20 07:17:04 crc kubenswrapper[4749]: I0320 07:17:04.126517 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-x9swj" podStartSLOduration=3.7734447859999998 podStartE2EDuration="49.126503431s" podCreationTimestamp="2026-03-20 07:16:15 +0000 UTC" firstStartedPulling="2026-03-20 07:16:18.376368139 +0000 UTC m=+214.926025786" lastFinishedPulling="2026-03-20 07:17:03.729426784 +0000 UTC m=+260.279084431" observedRunningTime="2026-03-20 07:17:04.121149883 +0000 UTC m=+260.670807520" watchObservedRunningTime="2026-03-20 07:17:04.126503431 +0000 UTC m=+260.676161078" Mar 20 07:17:04 crc kubenswrapper[4749]: I0320 07:17:04.514924 4749 patch_prober.go:28] interesting pod/machine-config-daemon-fxqfd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 07:17:04 crc kubenswrapper[4749]: I0320 07:17:04.514983 4749 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:17:04 crc kubenswrapper[4749]: I0320 07:17:04.515028 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" Mar 20 07:17:04 crc kubenswrapper[4749]: I0320 07:17:04.515571 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e7e97608b8dbd15f9f6a4df363aa16c0f7e4a3d501a4182627876064290b63e9"} pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 07:17:04 crc kubenswrapper[4749]: I0320 07:17:04.515620 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerName="machine-config-daemon" containerID="cri-o://e7e97608b8dbd15f9f6a4df363aa16c0f7e4a3d501a4182627876064290b63e9" gracePeriod=600 Mar 20 07:17:05 crc kubenswrapper[4749]: I0320 07:17:05.095187 4749 generic.go:334] "Generic (PLEG): container finished" podID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerID="e7e97608b8dbd15f9f6a4df363aa16c0f7e4a3d501a4182627876064290b63e9" exitCode=0 Mar 20 07:17:05 crc kubenswrapper[4749]: I0320 07:17:05.095256 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" event={"ID":"12151228-1cb9-4086-9a62-f4a9583f5f69","Type":"ContainerDied","Data":"e7e97608b8dbd15f9f6a4df363aa16c0f7e4a3d501a4182627876064290b63e9"} Mar 20 07:17:05 crc kubenswrapper[4749]: I0320 07:17:05.095941 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" event={"ID":"12151228-1cb9-4086-9a62-f4a9583f5f69","Type":"ContainerStarted","Data":"4c9937006b944b57a7ace3d87b4c4a8a6a9f78e9d693469869b65f6df516a69c"} Mar 20 07:17:05 crc kubenswrapper[4749]: I0320 07:17:05.097932 4749 generic.go:334] "Generic (PLEG): container finished" podID="ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0" containerID="2fa089fdfd234e5fbbd1f1a90f13d8163316131f2e222fbc6b2f37e8e477ace2" exitCode=0 Mar 20 07:17:05 crc kubenswrapper[4749]: I0320 07:17:05.098018 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bbjcq" event={"ID":"ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0","Type":"ContainerDied","Data":"2fa089fdfd234e5fbbd1f1a90f13d8163316131f2e222fbc6b2f37e8e477ace2"} Mar 20 07:17:05 crc kubenswrapper[4749]: I0320 07:17:05.101168 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9ss5d" event={"ID":"a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8","Type":"ContainerStarted","Data":"b76ccba3f0a5a2989140618b2be44e4edb8fc5245e111df682092ac05537917c"} Mar 20 07:17:05 crc kubenswrapper[4749]: I0320 07:17:05.930246 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rv5h9" Mar 20 07:17:05 crc kubenswrapper[4749]: I0320 07:17:05.930322 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rv5h9" Mar 20 07:17:05 crc 
kubenswrapper[4749]: I0320 07:17:05.981556 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rv5h9" Mar 20 07:17:06 crc kubenswrapper[4749]: I0320 07:17:06.112531 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bbjcq" event={"ID":"ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0","Type":"ContainerStarted","Data":"dc11e25d21efe015b8ce76810d8fef25160703332d24f50f8e4af7449f8e42e3"} Mar 20 07:17:06 crc kubenswrapper[4749]: I0320 07:17:06.114529 4749 generic.go:334] "Generic (PLEG): container finished" podID="a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8" containerID="b76ccba3f0a5a2989140618b2be44e4edb8fc5245e111df682092ac05537917c" exitCode=0 Mar 20 07:17:06 crc kubenswrapper[4749]: I0320 07:17:06.114576 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9ss5d" event={"ID":"a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8","Type":"ContainerDied","Data":"b76ccba3f0a5a2989140618b2be44e4edb8fc5245e111df682092ac05537917c"} Mar 20 07:17:06 crc kubenswrapper[4749]: I0320 07:17:06.135974 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bbjcq" podStartSLOduration=3.053177395 podStartE2EDuration="50.13595734s" podCreationTimestamp="2026-03-20 07:16:16 +0000 UTC" firstStartedPulling="2026-03-20 07:16:18.375315522 +0000 UTC m=+214.924973169" lastFinishedPulling="2026-03-20 07:17:05.458095467 +0000 UTC m=+262.007753114" observedRunningTime="2026-03-20 07:17:06.133091875 +0000 UTC m=+262.682749522" watchObservedRunningTime="2026-03-20 07:17:06.13595734 +0000 UTC m=+262.685614987" Mar 20 07:17:06 crc kubenswrapper[4749]: I0320 07:17:06.383985 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-x9swj" Mar 20 07:17:06 crc kubenswrapper[4749]: I0320 07:17:06.384032 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-x9swj" Mar 20 07:17:06 crc kubenswrapper[4749]: I0320 07:17:06.427545 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-x9swj" Mar 20 07:17:06 crc kubenswrapper[4749]: I0320 07:17:06.473722 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bbjcq" Mar 20 07:17:06 crc kubenswrapper[4749]: I0320 07:17:06.473770 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bbjcq" Mar 20 07:17:07 crc kubenswrapper[4749]: I0320 07:17:07.513574 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-bbjcq" podUID="ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0" containerName="registry-server" probeResult="failure" output=< Mar 20 07:17:07 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Mar 20 07:17:07 crc kubenswrapper[4749]: > Mar 20 07:17:08 crc kubenswrapper[4749]: I0320 07:17:08.133393 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9ss5d" event={"ID":"a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8","Type":"ContainerStarted","Data":"96c3778c4c41d3e6222d522de25ac4d41a6d02827a8c73b25ec24e7347258c10"} Mar 20 07:17:08 crc kubenswrapper[4749]: I0320 07:17:08.135652 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
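
[editor's note] The certified-operators-bbjcq startup probe output above ('timeout: failed to connect service ":50051" within 1s') is the registry-server health check timing out on its gRPC port while the catalog is still loading; it passes a few seconds later. A plain TCP reachability check with the same one-second budget approximates it (a simplification: the real probe speaks the gRPC health protocol on that port):

package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	conn, err := net.DialTimeout("tcp", "127.0.0.1:50051", time.Second)
	if err != nil {
		fmt.Printf("timeout: failed to connect service %q within 1s\n", ":50051")
		return
	}
	conn.Close()
	fmt.Println("startup probe ok")
}

[end note]
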
pod="openshift-marketplace/redhat-operators-lq4rb" event={"ID":"937dac41-5afa-495a-9909-1152a419549c","Type":"ContainerStarted","Data":"d92f9d3e4b5049a11264ba7179af1895262839ee476ac245385ea2a25dbab7f2"} Mar 20 07:17:08 crc kubenswrapper[4749]: I0320 07:17:08.153216 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9ss5d" podStartSLOduration=3.509402406 podStartE2EDuration="53.153193092s" podCreationTimestamp="2026-03-20 07:16:15 +0000 UTC" firstStartedPulling="2026-03-20 07:16:17.344373193 +0000 UTC m=+213.894030840" lastFinishedPulling="2026-03-20 07:17:06.988163879 +0000 UTC m=+263.537821526" observedRunningTime="2026-03-20 07:17:08.15276296 +0000 UTC m=+264.702420617" watchObservedRunningTime="2026-03-20 07:17:08.153193092 +0000 UTC m=+264.702850749" Mar 20 07:17:08 crc kubenswrapper[4749]: I0320 07:17:08.432554 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7c9d8ddb97-k2lf7" Mar 20 07:17:08 crc kubenswrapper[4749]: I0320 07:17:08.443580 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7c9d8ddb97-k2lf7" Mar 20 07:17:09 crc kubenswrapper[4749]: I0320 07:17:09.142466 4749 generic.go:334] "Generic (PLEG): container finished" podID="937dac41-5afa-495a-9909-1152a419549c" containerID="d92f9d3e4b5049a11264ba7179af1895262839ee476ac245385ea2a25dbab7f2" exitCode=0 Mar 20 07:17:09 crc kubenswrapper[4749]: I0320 07:17:09.142566 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lq4rb" event={"ID":"937dac41-5afa-495a-9909-1152a419549c","Type":"ContainerDied","Data":"d92f9d3e4b5049a11264ba7179af1895262839ee476ac245385ea2a25dbab7f2"} Mar 20 07:17:09 crc kubenswrapper[4749]: I0320 07:17:09.144357 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m7xc9" event={"ID":"8faad596-00ed-4982-9f42-2f1a2465098c","Type":"ContainerStarted","Data":"0d5ae3d79e06d7453fd6ee561c7382bb3d872ad722d8e4a54638397bec96701c"} Mar 20 07:17:10 crc kubenswrapper[4749]: I0320 07:17:10.154731 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lq4rb" event={"ID":"937dac41-5afa-495a-9909-1152a419549c","Type":"ContainerStarted","Data":"3ccff66f0c4353028f2babf1b3a14d4ada8c01486b7e7108c6ef6f214490ebfd"} Mar 20 07:17:10 crc kubenswrapper[4749]: I0320 07:17:10.157232 4749 generic.go:334] "Generic (PLEG): container finished" podID="8faad596-00ed-4982-9f42-2f1a2465098c" containerID="0d5ae3d79e06d7453fd6ee561c7382bb3d872ad722d8e4a54638397bec96701c" exitCode=0 Mar 20 07:17:10 crc kubenswrapper[4749]: I0320 07:17:10.157335 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m7xc9" event={"ID":"8faad596-00ed-4982-9f42-2f1a2465098c","Type":"ContainerDied","Data":"0d5ae3d79e06d7453fd6ee561c7382bb3d872ad722d8e4a54638397bec96701c"} Mar 20 07:17:10 crc kubenswrapper[4749]: I0320 07:17:10.160108 4749 generic.go:334] "Generic (PLEG): container finished" podID="cd2462fa-d077-4466-8930-6f2e69938c1b" containerID="f40a28d7711a2db4e38e71739316875514979b67697055bad76458e6d4d58f27" exitCode=0 Mar 20 07:17:10 crc kubenswrapper[4749]: I0320 07:17:10.160154 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-szw2w" 
event={"ID":"cd2462fa-d077-4466-8930-6f2e69938c1b","Type":"ContainerDied","Data":"f40a28d7711a2db4e38e71739316875514979b67697055bad76458e6d4d58f27"} Mar 20 07:17:10 crc kubenswrapper[4749]: I0320 07:17:10.208724 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lq4rb" podStartSLOduration=2.261566583 podStartE2EDuration="51.20870121s" podCreationTimestamp="2026-03-20 07:16:19 +0000 UTC" firstStartedPulling="2026-03-20 07:16:20.56323956 +0000 UTC m=+217.112897207" lastFinishedPulling="2026-03-20 07:17:09.510374187 +0000 UTC m=+266.060031834" observedRunningTime="2026-03-20 07:17:10.185089769 +0000 UTC m=+266.734747436" watchObservedRunningTime="2026-03-20 07:17:10.20870121 +0000 UTC m=+266.758358887" Mar 20 07:17:11 crc kubenswrapper[4749]: I0320 07:17:11.167845 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-szw2w" event={"ID":"cd2462fa-d077-4466-8930-6f2e69938c1b","Type":"ContainerStarted","Data":"b5fa511364a1f03bcfbc6c9fca4997da92b1fc1a23448675bf6ab27b8d392799"} Mar 20 07:17:11 crc kubenswrapper[4749]: I0320 07:17:11.170996 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m7xc9" event={"ID":"8faad596-00ed-4982-9f42-2f1a2465098c","Type":"ContainerStarted","Data":"7f3a1323d1b5bd0dc74dafb6010e8593316a78e287be8f743cad2b14377d198d"} Mar 20 07:17:11 crc kubenswrapper[4749]: I0320 07:17:11.201436 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-szw2w" podStartSLOduration=2.128842401 podStartE2EDuration="53.201406772s" podCreationTimestamp="2026-03-20 07:16:18 +0000 UTC" firstStartedPulling="2026-03-20 07:16:19.50776043 +0000 UTC m=+216.057418077" lastFinishedPulling="2026-03-20 07:17:10.580324801 +0000 UTC m=+267.129982448" observedRunningTime="2026-03-20 07:17:11.19335944 +0000 UTC m=+267.743017107" watchObservedRunningTime="2026-03-20 07:17:11.201406772 +0000 UTC m=+267.751064479" Mar 20 07:17:11 crc kubenswrapper[4749]: I0320 07:17:11.218516 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-m7xc9" podStartSLOduration=3.1534333820000002 podStartE2EDuration="53.218501031s" podCreationTimestamp="2026-03-20 07:16:18 +0000 UTC" firstStartedPulling="2026-03-20 07:16:20.573556945 +0000 UTC m=+217.123214592" lastFinishedPulling="2026-03-20 07:17:10.638624594 +0000 UTC m=+267.188282241" observedRunningTime="2026-03-20 07:17:11.215756419 +0000 UTC m=+267.765414066" watchObservedRunningTime="2026-03-20 07:17:11.218501031 +0000 UTC m=+267.768158678" Mar 20 07:17:15 crc kubenswrapper[4749]: I0320 07:17:15.978689 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rv5h9" Mar 20 07:17:16 crc kubenswrapper[4749]: I0320 07:17:16.171544 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9ss5d" Mar 20 07:17:16 crc kubenswrapper[4749]: I0320 07:17:16.171900 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9ss5d" Mar 20 07:17:16 crc kubenswrapper[4749]: I0320 07:17:16.211112 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9ss5d" Mar 20 07:17:16 crc kubenswrapper[4749]: I0320 07:17:16.426950 4749 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/community-operators-x9swj" Mar 20 07:17:16 crc kubenswrapper[4749]: I0320 07:17:16.516060 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bbjcq" Mar 20 07:17:16 crc kubenswrapper[4749]: I0320 07:17:16.556958 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bbjcq" Mar 20 07:17:16 crc kubenswrapper[4749]: I0320 07:17:16.630933 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7c9d8ddb97-k2lf7"] Mar 20 07:17:16 crc kubenswrapper[4749]: I0320 07:17:16.631218 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7c9d8ddb97-k2lf7" podUID="c2871df9-37c9-45aa-a6f9-af85d64f7615" containerName="controller-manager" containerID="cri-o://eecc02eecdca0b76a1879506f3e3f77c9f964dbc0fc414c78f34eb3098318722" gracePeriod=30 Mar 20 07:17:16 crc kubenswrapper[4749]: I0320 07:17:16.635949 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b77d896c5-9fgdb"] Mar 20 07:17:16 crc kubenswrapper[4749]: I0320 07:17:16.636185 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6b77d896c5-9fgdb" podUID="6551fdd2-b4de-4478-aa69-c1c2bb35d1b7" containerName="route-controller-manager" containerID="cri-o://96da0f68eb00a3a5cc59f119b7f6dff435df8cc1ef1ec0e0d73cac4bbb0ad58c" gracePeriod=30 Mar 20 07:17:17 crc kubenswrapper[4749]: I0320 07:17:17.204968 4749 generic.go:334] "Generic (PLEG): container finished" podID="6551fdd2-b4de-4478-aa69-c1c2bb35d1b7" containerID="96da0f68eb00a3a5cc59f119b7f6dff435df8cc1ef1ec0e0d73cac4bbb0ad58c" exitCode=0 Mar 20 07:17:17 crc kubenswrapper[4749]: I0320 07:17:17.205049 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b77d896c5-9fgdb" event={"ID":"6551fdd2-b4de-4478-aa69-c1c2bb35d1b7","Type":"ContainerDied","Data":"96da0f68eb00a3a5cc59f119b7f6dff435df8cc1ef1ec0e0d73cac4bbb0ad58c"} Mar 20 07:17:17 crc kubenswrapper[4749]: I0320 07:17:17.205099 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b77d896c5-9fgdb" event={"ID":"6551fdd2-b4de-4478-aa69-c1c2bb35d1b7","Type":"ContainerDied","Data":"67dad6306c37b4a373584088fcd873356cd5bbee2fdf9af527223aa7cd2687cc"} Mar 20 07:17:17 crc kubenswrapper[4749]: I0320 07:17:17.205115 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67dad6306c37b4a373584088fcd873356cd5bbee2fdf9af527223aa7cd2687cc" Mar 20 07:17:17 crc kubenswrapper[4749]: I0320 07:17:17.206845 4749 generic.go:334] "Generic (PLEG): container finished" podID="c2871df9-37c9-45aa-a6f9-af85d64f7615" containerID="eecc02eecdca0b76a1879506f3e3f77c9f964dbc0fc414c78f34eb3098318722" exitCode=0 Mar 20 07:17:17 crc kubenswrapper[4749]: I0320 07:17:17.206899 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c9d8ddb97-k2lf7" event={"ID":"c2871df9-37c9-45aa-a6f9-af85d64f7615","Type":"ContainerDied","Data":"eecc02eecdca0b76a1879506f3e3f77c9f964dbc0fc414c78f34eb3098318722"} Mar 20 07:17:17 crc kubenswrapper[4749]: I0320 07:17:17.240508 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b77d896c5-9fgdb" Mar 20 07:17:17 crc kubenswrapper[4749]: I0320 07:17:17.242732 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c9d8ddb97-k2lf7" Mar 20 07:17:17 crc kubenswrapper[4749]: I0320 07:17:17.255216 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9ss5d" Mar 20 07:17:17 crc kubenswrapper[4749]: I0320 07:17:17.399940 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2871df9-37c9-45aa-a6f9-af85d64f7615-config\") pod \"c2871df9-37c9-45aa-a6f9-af85d64f7615\" (UID: \"c2871df9-37c9-45aa-a6f9-af85d64f7615\") " Mar 20 07:17:17 crc kubenswrapper[4749]: I0320 07:17:17.399996 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6551fdd2-b4de-4478-aa69-c1c2bb35d1b7-serving-cert\") pod \"6551fdd2-b4de-4478-aa69-c1c2bb35d1b7\" (UID: \"6551fdd2-b4de-4478-aa69-c1c2bb35d1b7\") " Mar 20 07:17:17 crc kubenswrapper[4749]: I0320 07:17:17.400061 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6551fdd2-b4de-4478-aa69-c1c2bb35d1b7-client-ca\") pod \"6551fdd2-b4de-4478-aa69-c1c2bb35d1b7\" (UID: \"6551fdd2-b4de-4478-aa69-c1c2bb35d1b7\") " Mar 20 07:17:17 crc kubenswrapper[4749]: I0320 07:17:17.400092 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2871df9-37c9-45aa-a6f9-af85d64f7615-serving-cert\") pod \"c2871df9-37c9-45aa-a6f9-af85d64f7615\" (UID: \"c2871df9-37c9-45aa-a6f9-af85d64f7615\") " Mar 20 07:17:17 crc kubenswrapper[4749]: I0320 07:17:17.400125 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c2871df9-37c9-45aa-a6f9-af85d64f7615-proxy-ca-bundles\") pod \"c2871df9-37c9-45aa-a6f9-af85d64f7615\" (UID: \"c2871df9-37c9-45aa-a6f9-af85d64f7615\") " Mar 20 07:17:17 crc kubenswrapper[4749]: I0320 07:17:17.400181 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rnbq\" (UniqueName: \"kubernetes.io/projected/6551fdd2-b4de-4478-aa69-c1c2bb35d1b7-kube-api-access-4rnbq\") pod \"6551fdd2-b4de-4478-aa69-c1c2bb35d1b7\" (UID: \"6551fdd2-b4de-4478-aa69-c1c2bb35d1b7\") " Mar 20 07:17:17 crc kubenswrapper[4749]: I0320 07:17:17.400222 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnspv\" (UniqueName: \"kubernetes.io/projected/c2871df9-37c9-45aa-a6f9-af85d64f7615-kube-api-access-vnspv\") pod \"c2871df9-37c9-45aa-a6f9-af85d64f7615\" (UID: \"c2871df9-37c9-45aa-a6f9-af85d64f7615\") " Mar 20 07:17:17 crc kubenswrapper[4749]: I0320 07:17:17.400243 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6551fdd2-b4de-4478-aa69-c1c2bb35d1b7-config\") pod \"6551fdd2-b4de-4478-aa69-c1c2bb35d1b7\" (UID: \"6551fdd2-b4de-4478-aa69-c1c2bb35d1b7\") " Mar 20 07:17:17 crc kubenswrapper[4749]: I0320 07:17:17.400311 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/c2871df9-37c9-45aa-a6f9-af85d64f7615-client-ca\") pod \"c2871df9-37c9-45aa-a6f9-af85d64f7615\" (UID: \"c2871df9-37c9-45aa-a6f9-af85d64f7615\") " Mar 20 07:17:17 crc kubenswrapper[4749]: I0320 07:17:17.401155 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2871df9-37c9-45aa-a6f9-af85d64f7615-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "c2871df9-37c9-45aa-a6f9-af85d64f7615" (UID: "c2871df9-37c9-45aa-a6f9-af85d64f7615"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:17:17 crc kubenswrapper[4749]: I0320 07:17:17.401690 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6551fdd2-b4de-4478-aa69-c1c2bb35d1b7-client-ca" (OuterVolumeSpecName: "client-ca") pod "6551fdd2-b4de-4478-aa69-c1c2bb35d1b7" (UID: "6551fdd2-b4de-4478-aa69-c1c2bb35d1b7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:17:17 crc kubenswrapper[4749]: I0320 07:17:17.401726 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2871df9-37c9-45aa-a6f9-af85d64f7615-config" (OuterVolumeSpecName: "config") pod "c2871df9-37c9-45aa-a6f9-af85d64f7615" (UID: "c2871df9-37c9-45aa-a6f9-af85d64f7615"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:17:17 crc kubenswrapper[4749]: I0320 07:17:17.401741 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2871df9-37c9-45aa-a6f9-af85d64f7615-client-ca" (OuterVolumeSpecName: "client-ca") pod "c2871df9-37c9-45aa-a6f9-af85d64f7615" (UID: "c2871df9-37c9-45aa-a6f9-af85d64f7615"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:17:17 crc kubenswrapper[4749]: I0320 07:17:17.401759 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6551fdd2-b4de-4478-aa69-c1c2bb35d1b7-config" (OuterVolumeSpecName: "config") pod "6551fdd2-b4de-4478-aa69-c1c2bb35d1b7" (UID: "6551fdd2-b4de-4478-aa69-c1c2bb35d1b7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:17:17 crc kubenswrapper[4749]: I0320 07:17:17.406500 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2871df9-37c9-45aa-a6f9-af85d64f7615-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c2871df9-37c9-45aa-a6f9-af85d64f7615" (UID: "c2871df9-37c9-45aa-a6f9-af85d64f7615"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:17:17 crc kubenswrapper[4749]: I0320 07:17:17.406606 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6551fdd2-b4de-4478-aa69-c1c2bb35d1b7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6551fdd2-b4de-4478-aa69-c1c2bb35d1b7" (UID: "6551fdd2-b4de-4478-aa69-c1c2bb35d1b7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:17:17 crc kubenswrapper[4749]: I0320 07:17:17.406656 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6551fdd2-b4de-4478-aa69-c1c2bb35d1b7-kube-api-access-4rnbq" (OuterVolumeSpecName: "kube-api-access-4rnbq") pod "6551fdd2-b4de-4478-aa69-c1c2bb35d1b7" (UID: "6551fdd2-b4de-4478-aa69-c1c2bb35d1b7"). 
InnerVolumeSpecName "kube-api-access-4rnbq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:17:17 crc kubenswrapper[4749]: I0320 07:17:17.407378 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2871df9-37c9-45aa-a6f9-af85d64f7615-kube-api-access-vnspv" (OuterVolumeSpecName: "kube-api-access-vnspv") pod "c2871df9-37c9-45aa-a6f9-af85d64f7615" (UID: "c2871df9-37c9-45aa-a6f9-af85d64f7615"). InnerVolumeSpecName "kube-api-access-vnspv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:17:17 crc kubenswrapper[4749]: I0320 07:17:17.501837 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnspv\" (UniqueName: \"kubernetes.io/projected/c2871df9-37c9-45aa-a6f9-af85d64f7615-kube-api-access-vnspv\") on node \"crc\" DevicePath \"\"" Mar 20 07:17:17 crc kubenswrapper[4749]: I0320 07:17:17.501880 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6551fdd2-b4de-4478-aa69-c1c2bb35d1b7-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:17:17 crc kubenswrapper[4749]: I0320 07:17:17.501895 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2871df9-37c9-45aa-a6f9-af85d64f7615-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 07:17:17 crc kubenswrapper[4749]: I0320 07:17:17.501908 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2871df9-37c9-45aa-a6f9-af85d64f7615-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:17:17 crc kubenswrapper[4749]: I0320 07:17:17.501919 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6551fdd2-b4de-4478-aa69-c1c2bb35d1b7-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 07:17:17 crc kubenswrapper[4749]: I0320 07:17:17.501928 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6551fdd2-b4de-4478-aa69-c1c2bb35d1b7-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 07:17:17 crc kubenswrapper[4749]: I0320 07:17:17.501937 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2871df9-37c9-45aa-a6f9-af85d64f7615-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 07:17:17 crc kubenswrapper[4749]: I0320 07:17:17.501946 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c2871df9-37c9-45aa-a6f9-af85d64f7615-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 07:17:17 crc kubenswrapper[4749]: I0320 07:17:17.501957 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rnbq\" (UniqueName: \"kubernetes.io/projected/6551fdd2-b4de-4478-aa69-c1c2bb35d1b7-kube-api-access-4rnbq\") on node \"crc\" DevicePath \"\"" Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.064387 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bbjcq"] Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.067595 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bb5477fd6-q8ksg"] Mar 20 07:17:18 crc kubenswrapper[4749]: E0320 07:17:18.068039 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2871df9-37c9-45aa-a6f9-af85d64f7615" containerName="controller-manager" Mar 20 07:17:18 crc 
kubenswrapper[4749]: I0320 07:17:18.068080 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2871df9-37c9-45aa-a6f9-af85d64f7615" containerName="controller-manager" Mar 20 07:17:18 crc kubenswrapper[4749]: E0320 07:17:18.068121 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6551fdd2-b4de-4478-aa69-c1c2bb35d1b7" containerName="route-controller-manager" Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.068139 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6551fdd2-b4de-4478-aa69-c1c2bb35d1b7" containerName="route-controller-manager" Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.068432 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6551fdd2-b4de-4478-aa69-c1c2bb35d1b7" containerName="route-controller-manager" Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.068477 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2871df9-37c9-45aa-a6f9-af85d64f7615" containerName="controller-manager" Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.069368 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-bb5477fd6-q8ksg" Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.071013 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5b98466987-56rc8"] Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.072415 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b98466987-56rc8" Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.075002 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b98466987-56rc8"] Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.078532 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bb5477fd6-q8ksg"] Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.212696 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86a83215-d581-4836-852f-721e6ea3db4b-serving-cert\") pod \"controller-manager-5b98466987-56rc8\" (UID: \"86a83215-d581-4836-852f-721e6ea3db4b\") " pod="openshift-controller-manager/controller-manager-5b98466987-56rc8" Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.213016 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/86a83215-d581-4836-852f-721e6ea3db4b-client-ca\") pod \"controller-manager-5b98466987-56rc8\" (UID: \"86a83215-d581-4836-852f-721e6ea3db4b\") " pod="openshift-controller-manager/controller-manager-5b98466987-56rc8" Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.213051 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx82b\" (UniqueName: \"kubernetes.io/projected/86a83215-d581-4836-852f-721e6ea3db4b-kube-api-access-gx82b\") pod \"controller-manager-5b98466987-56rc8\" (UID: \"86a83215-d581-4836-852f-721e6ea3db4b\") " pod="openshift-controller-manager/controller-manager-5b98466987-56rc8" Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.213081 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/86a83215-d581-4836-852f-721e6ea3db4b-proxy-ca-bundles\") pod \"controller-manager-5b98466987-56rc8\" (UID: \"86a83215-d581-4836-852f-721e6ea3db4b\") " pod="openshift-controller-manager/controller-manager-5b98466987-56rc8" Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.213098 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d46e89a9-b781-454b-8e2d-870e92825114-config\") pod \"route-controller-manager-bb5477fd6-q8ksg\" (UID: \"d46e89a9-b781-454b-8e2d-870e92825114\") " pod="openshift-route-controller-manager/route-controller-manager-bb5477fd6-q8ksg" Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.213122 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d46e89a9-b781-454b-8e2d-870e92825114-client-ca\") pod \"route-controller-manager-bb5477fd6-q8ksg\" (UID: \"d46e89a9-b781-454b-8e2d-870e92825114\") " pod="openshift-route-controller-manager/route-controller-manager-bb5477fd6-q8ksg" Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.213276 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86a83215-d581-4836-852f-721e6ea3db4b-config\") pod \"controller-manager-5b98466987-56rc8\" (UID: \"86a83215-d581-4836-852f-721e6ea3db4b\") " pod="openshift-controller-manager/controller-manager-5b98466987-56rc8" Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.213437 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lsdv\" (UniqueName: \"kubernetes.io/projected/d46e89a9-b781-454b-8e2d-870e92825114-kube-api-access-9lsdv\") pod \"route-controller-manager-bb5477fd6-q8ksg\" (UID: \"d46e89a9-b781-454b-8e2d-870e92825114\") " pod="openshift-route-controller-manager/route-controller-manager-bb5477fd6-q8ksg" Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.213512 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d46e89a9-b781-454b-8e2d-870e92825114-serving-cert\") pod \"route-controller-manager-bb5477fd6-q8ksg\" (UID: \"d46e89a9-b781-454b-8e2d-870e92825114\") " pod="openshift-route-controller-manager/route-controller-manager-bb5477fd6-q8ksg" Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.216398 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c9d8ddb97-k2lf7" event={"ID":"c2871df9-37c9-45aa-a6f9-af85d64f7615","Type":"ContainerDied","Data":"cbc5490bad36590e4af6497c53d4343fdadb78835df1d22661123308a537edf7"} Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.216452 4749 scope.go:117] "RemoveContainer" containerID="eecc02eecdca0b76a1879506f3e3f77c9f964dbc0fc414c78f34eb3098318722" Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.216545 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bbjcq" podUID="ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0" containerName="registry-server" containerID="cri-o://dc11e25d21efe015b8ce76810d8fef25160703332d24f50f8e4af7449f8e42e3" gracePeriod=2 Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.216560 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7c9d8ddb97-k2lf7" Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.216783 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b77d896c5-9fgdb" Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.250405 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7c9d8ddb97-k2lf7"] Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.254690 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7c9d8ddb97-k2lf7"] Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.260223 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b77d896c5-9fgdb"] Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.263028 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b77d896c5-9fgdb"] Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.314504 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86a83215-d581-4836-852f-721e6ea3db4b-config\") pod \"controller-manager-5b98466987-56rc8\" (UID: \"86a83215-d581-4836-852f-721e6ea3db4b\") " pod="openshift-controller-manager/controller-manager-5b98466987-56rc8" Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.314847 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lsdv\" (UniqueName: \"kubernetes.io/projected/d46e89a9-b781-454b-8e2d-870e92825114-kube-api-access-9lsdv\") pod \"route-controller-manager-bb5477fd6-q8ksg\" (UID: \"d46e89a9-b781-454b-8e2d-870e92825114\") " pod="openshift-route-controller-manager/route-controller-manager-bb5477fd6-q8ksg" Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.314889 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d46e89a9-b781-454b-8e2d-870e92825114-serving-cert\") pod \"route-controller-manager-bb5477fd6-q8ksg\" (UID: \"d46e89a9-b781-454b-8e2d-870e92825114\") " pod="openshift-route-controller-manager/route-controller-manager-bb5477fd6-q8ksg" Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.314961 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86a83215-d581-4836-852f-721e6ea3db4b-serving-cert\") pod \"controller-manager-5b98466987-56rc8\" (UID: \"86a83215-d581-4836-852f-721e6ea3db4b\") " pod="openshift-controller-manager/controller-manager-5b98466987-56rc8" Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.314985 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/86a83215-d581-4836-852f-721e6ea3db4b-client-ca\") pod \"controller-manager-5b98466987-56rc8\" (UID: \"86a83215-d581-4836-852f-721e6ea3db4b\") " pod="openshift-controller-manager/controller-manager-5b98466987-56rc8" Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.315016 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gx82b\" (UniqueName: \"kubernetes.io/projected/86a83215-d581-4836-852f-721e6ea3db4b-kube-api-access-gx82b\") pod \"controller-manager-5b98466987-56rc8\" (UID: 
\"86a83215-d581-4836-852f-721e6ea3db4b\") " pod="openshift-controller-manager/controller-manager-5b98466987-56rc8" Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.315041 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/86a83215-d581-4836-852f-721e6ea3db4b-proxy-ca-bundles\") pod \"controller-manager-5b98466987-56rc8\" (UID: \"86a83215-d581-4836-852f-721e6ea3db4b\") " pod="openshift-controller-manager/controller-manager-5b98466987-56rc8" Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.315059 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d46e89a9-b781-454b-8e2d-870e92825114-config\") pod \"route-controller-manager-bb5477fd6-q8ksg\" (UID: \"d46e89a9-b781-454b-8e2d-870e92825114\") " pod="openshift-route-controller-manager/route-controller-manager-bb5477fd6-q8ksg" Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.315081 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d46e89a9-b781-454b-8e2d-870e92825114-client-ca\") pod \"route-controller-manager-bb5477fd6-q8ksg\" (UID: \"d46e89a9-b781-454b-8e2d-870e92825114\") " pod="openshift-route-controller-manager/route-controller-manager-bb5477fd6-q8ksg" Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.316571 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/86a83215-d581-4836-852f-721e6ea3db4b-client-ca\") pod \"controller-manager-5b98466987-56rc8\" (UID: \"86a83215-d581-4836-852f-721e6ea3db4b\") " pod="openshift-controller-manager/controller-manager-5b98466987-56rc8" Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.316995 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/86a83215-d581-4836-852f-721e6ea3db4b-proxy-ca-bundles\") pod \"controller-manager-5b98466987-56rc8\" (UID: \"86a83215-d581-4836-852f-721e6ea3db4b\") " pod="openshift-controller-manager/controller-manager-5b98466987-56rc8" Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.317194 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d46e89a9-b781-454b-8e2d-870e92825114-config\") pod \"route-controller-manager-bb5477fd6-q8ksg\" (UID: \"d46e89a9-b781-454b-8e2d-870e92825114\") " pod="openshift-route-controller-manager/route-controller-manager-bb5477fd6-q8ksg" Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.318958 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d46e89a9-b781-454b-8e2d-870e92825114-client-ca\") pod \"route-controller-manager-bb5477fd6-q8ksg\" (UID: \"d46e89a9-b781-454b-8e2d-870e92825114\") " pod="openshift-route-controller-manager/route-controller-manager-bb5477fd6-q8ksg" Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.319854 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86a83215-d581-4836-852f-721e6ea3db4b-config\") pod \"controller-manager-5b98466987-56rc8\" (UID: \"86a83215-d581-4836-852f-721e6ea3db4b\") " pod="openshift-controller-manager/controller-manager-5b98466987-56rc8" Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.330989 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86a83215-d581-4836-852f-721e6ea3db4b-serving-cert\") pod \"controller-manager-5b98466987-56rc8\" (UID: \"86a83215-d581-4836-852f-721e6ea3db4b\") " pod="openshift-controller-manager/controller-manager-5b98466987-56rc8" Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.334337 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d46e89a9-b781-454b-8e2d-870e92825114-serving-cert\") pod \"route-controller-manager-bb5477fd6-q8ksg\" (UID: \"d46e89a9-b781-454b-8e2d-870e92825114\") " pod="openshift-route-controller-manager/route-controller-manager-bb5477fd6-q8ksg" Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.341579 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lsdv\" (UniqueName: \"kubernetes.io/projected/d46e89a9-b781-454b-8e2d-870e92825114-kube-api-access-9lsdv\") pod \"route-controller-manager-bb5477fd6-q8ksg\" (UID: \"d46e89a9-b781-454b-8e2d-870e92825114\") " pod="openshift-route-controller-manager/route-controller-manager-bb5477fd6-q8ksg" Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.348539 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gx82b\" (UniqueName: \"kubernetes.io/projected/86a83215-d581-4836-852f-721e6ea3db4b-kube-api-access-gx82b\") pod \"controller-manager-5b98466987-56rc8\" (UID: \"86a83215-d581-4836-852f-721e6ea3db4b\") " pod="openshift-controller-manager/controller-manager-5b98466987-56rc8" Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.402005 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-bb5477fd6-q8ksg" Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.416393 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b98466987-56rc8" Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.471591 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-szw2w" Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.471640 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-szw2w" Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.523076 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-szw2w" Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.657860 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x9swj"] Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.658099 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-x9swj" podUID="a22f47dc-59ce-4cce-821c-508fc14a9508" containerName="registry-server" containerID="cri-o://bbc52864abe0fe9cf380b57e4493cf00807794a3926cfa0edc36578936ac2d49" gracePeriod=2 Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.708986 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bbjcq" Mar 20 07:17:18 crc kubenswrapper[4749]: I0320 07:17:18.742498 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b98466987-56rc8"] Mar 20 07:17:19 crc kubenswrapper[4749]: I0320 07:17:18.826934 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckwfm\" (UniqueName: \"kubernetes.io/projected/ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0-kube-api-access-ckwfm\") pod \"ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0\" (UID: \"ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0\") " Mar 20 07:17:19 crc kubenswrapper[4749]: I0320 07:17:18.827036 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0-utilities\") pod \"ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0\" (UID: \"ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0\") " Mar 20 07:17:19 crc kubenswrapper[4749]: I0320 07:17:18.827063 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0-catalog-content\") pod \"ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0\" (UID: \"ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0\") " Mar 20 07:17:19 crc kubenswrapper[4749]: I0320 07:17:18.827858 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0-utilities" (OuterVolumeSpecName: "utilities") pod "ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0" (UID: "ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:17:19 crc kubenswrapper[4749]: I0320 07:17:18.879833 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0" (UID: "ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:17:19 crc kubenswrapper[4749]: I0320 07:17:18.928214 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 07:17:19 crc kubenswrapper[4749]: I0320 07:17:18.928252 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 07:17:19 crc kubenswrapper[4749]: I0320 07:17:19.005879 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0-kube-api-access-ckwfm" (OuterVolumeSpecName: "kube-api-access-ckwfm") pod "ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0" (UID: "ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0"). InnerVolumeSpecName "kube-api-access-ckwfm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:17:19 crc kubenswrapper[4749]: I0320 07:17:19.029839 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckwfm\" (UniqueName: \"kubernetes.io/projected/ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0-kube-api-access-ckwfm\") on node \"crc\" DevicePath \"\"" Mar 20 07:17:19 crc kubenswrapper[4749]: I0320 07:17:19.041807 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bb5477fd6-q8ksg"] Mar 20 07:17:19 crc kubenswrapper[4749]: I0320 07:17:19.074461 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-t5b5l"] Mar 20 07:17:19 crc kubenswrapper[4749]: W0320 07:17:19.114130 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd46e89a9_b781_454b_8e2d_870e92825114.slice/crio-5b9e5c6248f7dfba8a14f26d35d2390b728cb4ea98c5b94b8a0b99dfd0e4aadb WatchSource:0}: Error finding container 5b9e5c6248f7dfba8a14f26d35d2390b728cb4ea98c5b94b8a0b99dfd0e4aadb: Status 404 returned error can't find the container with id 5b9e5c6248f7dfba8a14f26d35d2390b728cb4ea98c5b94b8a0b99dfd0e4aadb Mar 20 07:17:19 crc kubenswrapper[4749]: I0320 07:17:19.171712 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-m7xc9" Mar 20 07:17:19 crc kubenswrapper[4749]: I0320 07:17:19.171771 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-m7xc9" Mar 20 07:17:19 crc kubenswrapper[4749]: I0320 07:17:19.224461 4749 generic.go:334] "Generic (PLEG): container finished" podID="ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0" containerID="dc11e25d21efe015b8ce76810d8fef25160703332d24f50f8e4af7449f8e42e3" exitCode=0 Mar 20 07:17:19 crc kubenswrapper[4749]: I0320 07:17:19.224525 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bbjcq" event={"ID":"ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0","Type":"ContainerDied","Data":"dc11e25d21efe015b8ce76810d8fef25160703332d24f50f8e4af7449f8e42e3"} Mar 20 07:17:19 crc kubenswrapper[4749]: I0320 07:17:19.224553 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bbjcq" event={"ID":"ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0","Type":"ContainerDied","Data":"3423a7f9f90c2055805b9647f76845cb1c3e89ec49178df9b4556f1322008e29"} Mar 20 07:17:19 crc kubenswrapper[4749]: I0320 07:17:19.224569 4749 scope.go:117] "RemoveContainer" containerID="dc11e25d21efe015b8ce76810d8fef25160703332d24f50f8e4af7449f8e42e3" Mar 20 07:17:19 crc kubenswrapper[4749]: I0320 07:17:19.224685 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bbjcq" Mar 20 07:17:19 crc kubenswrapper[4749]: I0320 07:17:19.227595 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b98466987-56rc8" event={"ID":"86a83215-d581-4836-852f-721e6ea3db4b","Type":"ContainerStarted","Data":"b1fe5fa978d12000e7b0e8e395283f666351c9f498a0799e64fb46b4bbf8f50e"} Mar 20 07:17:19 crc kubenswrapper[4749]: I0320 07:17:19.231530 4749 generic.go:334] "Generic (PLEG): container finished" podID="a22f47dc-59ce-4cce-821c-508fc14a9508" containerID="bbc52864abe0fe9cf380b57e4493cf00807794a3926cfa0edc36578936ac2d49" exitCode=0 Mar 20 07:17:19 crc kubenswrapper[4749]: I0320 07:17:19.231659 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x9swj" event={"ID":"a22f47dc-59ce-4cce-821c-508fc14a9508","Type":"ContainerDied","Data":"bbc52864abe0fe9cf380b57e4493cf00807794a3926cfa0edc36578936ac2d49"} Mar 20 07:17:19 crc kubenswrapper[4749]: I0320 07:17:19.234127 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-bb5477fd6-q8ksg" event={"ID":"d46e89a9-b781-454b-8e2d-870e92825114","Type":"ContainerStarted","Data":"5b9e5c6248f7dfba8a14f26d35d2390b728cb4ea98c5b94b8a0b99dfd0e4aadb"} Mar 20 07:17:19 crc kubenswrapper[4749]: I0320 07:17:19.244865 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-m7xc9" Mar 20 07:17:19 crc kubenswrapper[4749]: I0320 07:17:19.245849 4749 scope.go:117] "RemoveContainer" containerID="2fa089fdfd234e5fbbd1f1a90f13d8163316131f2e222fbc6b2f37e8e477ace2" Mar 20 07:17:19 crc kubenswrapper[4749]: I0320 07:17:19.256391 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bbjcq"] Mar 20 07:17:19 crc kubenswrapper[4749]: I0320 07:17:19.264186 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bbjcq"] Mar 20 07:17:19 crc kubenswrapper[4749]: I0320 07:17:19.271812 4749 scope.go:117] "RemoveContainer" containerID="230e7266060294fea9bd12b211aabca3ce92e4a4179d32772a8e624a999db801" Mar 20 07:17:19 crc kubenswrapper[4749]: I0320 07:17:19.308327 4749 scope.go:117] "RemoveContainer" containerID="dc11e25d21efe015b8ce76810d8fef25160703332d24f50f8e4af7449f8e42e3" Mar 20 07:17:19 crc kubenswrapper[4749]: I0320 07:17:19.316761 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-szw2w" Mar 20 07:17:19 crc kubenswrapper[4749]: E0320 07:17:19.316820 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc11e25d21efe015b8ce76810d8fef25160703332d24f50f8e4af7449f8e42e3\": container with ID starting with dc11e25d21efe015b8ce76810d8fef25160703332d24f50f8e4af7449f8e42e3 not found: ID does not exist" containerID="dc11e25d21efe015b8ce76810d8fef25160703332d24f50f8e4af7449f8e42e3" Mar 20 07:17:19 crc kubenswrapper[4749]: I0320 07:17:19.316850 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-m7xc9" Mar 20 07:17:19 crc kubenswrapper[4749]: I0320 07:17:19.316844 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc11e25d21efe015b8ce76810d8fef25160703332d24f50f8e4af7449f8e42e3"} err="failed to get container status 
\"dc11e25d21efe015b8ce76810d8fef25160703332d24f50f8e4af7449f8e42e3\": rpc error: code = NotFound desc = could not find container \"dc11e25d21efe015b8ce76810d8fef25160703332d24f50f8e4af7449f8e42e3\": container with ID starting with dc11e25d21efe015b8ce76810d8fef25160703332d24f50f8e4af7449f8e42e3 not found: ID does not exist" Mar 20 07:17:19 crc kubenswrapper[4749]: I0320 07:17:19.316900 4749 scope.go:117] "RemoveContainer" containerID="2fa089fdfd234e5fbbd1f1a90f13d8163316131f2e222fbc6b2f37e8e477ace2" Mar 20 07:17:19 crc kubenswrapper[4749]: E0320 07:17:19.317225 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fa089fdfd234e5fbbd1f1a90f13d8163316131f2e222fbc6b2f37e8e477ace2\": container with ID starting with 2fa089fdfd234e5fbbd1f1a90f13d8163316131f2e222fbc6b2f37e8e477ace2 not found: ID does not exist" containerID="2fa089fdfd234e5fbbd1f1a90f13d8163316131f2e222fbc6b2f37e8e477ace2" Mar 20 07:17:19 crc kubenswrapper[4749]: I0320 07:17:19.317258 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fa089fdfd234e5fbbd1f1a90f13d8163316131f2e222fbc6b2f37e8e477ace2"} err="failed to get container status \"2fa089fdfd234e5fbbd1f1a90f13d8163316131f2e222fbc6b2f37e8e477ace2\": rpc error: code = NotFound desc = could not find container \"2fa089fdfd234e5fbbd1f1a90f13d8163316131f2e222fbc6b2f37e8e477ace2\": container with ID starting with 2fa089fdfd234e5fbbd1f1a90f13d8163316131f2e222fbc6b2f37e8e477ace2 not found: ID does not exist" Mar 20 07:17:19 crc kubenswrapper[4749]: I0320 07:17:19.317322 4749 scope.go:117] "RemoveContainer" containerID="230e7266060294fea9bd12b211aabca3ce92e4a4179d32772a8e624a999db801" Mar 20 07:17:19 crc kubenswrapper[4749]: E0320 07:17:19.318997 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"230e7266060294fea9bd12b211aabca3ce92e4a4179d32772a8e624a999db801\": container with ID starting with 230e7266060294fea9bd12b211aabca3ce92e4a4179d32772a8e624a999db801 not found: ID does not exist" containerID="230e7266060294fea9bd12b211aabca3ce92e4a4179d32772a8e624a999db801" Mar 20 07:17:19 crc kubenswrapper[4749]: I0320 07:17:19.319021 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"230e7266060294fea9bd12b211aabca3ce92e4a4179d32772a8e624a999db801"} err="failed to get container status \"230e7266060294fea9bd12b211aabca3ce92e4a4179d32772a8e624a999db801\": rpc error: code = NotFound desc = could not find container \"230e7266060294fea9bd12b211aabca3ce92e4a4179d32772a8e624a999db801\": container with ID starting with 230e7266060294fea9bd12b211aabca3ce92e4a4179d32772a8e624a999db801 not found: ID does not exist" Mar 20 07:17:19 crc kubenswrapper[4749]: I0320 07:17:19.478901 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lq4rb" Mar 20 07:17:19 crc kubenswrapper[4749]: I0320 07:17:19.479231 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lq4rb" Mar 20 07:17:19 crc kubenswrapper[4749]: I0320 07:17:19.523825 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lq4rb" Mar 20 07:17:19 crc kubenswrapper[4749]: I0320 07:17:19.935006 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-x9swj" Mar 20 07:17:19 crc kubenswrapper[4749]: I0320 07:17:19.938238 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a22f47dc-59ce-4cce-821c-508fc14a9508-utilities\") pod \"a22f47dc-59ce-4cce-821c-508fc14a9508\" (UID: \"a22f47dc-59ce-4cce-821c-508fc14a9508\") " Mar 20 07:17:19 crc kubenswrapper[4749]: I0320 07:17:19.939093 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hk8jn\" (UniqueName: \"kubernetes.io/projected/a22f47dc-59ce-4cce-821c-508fc14a9508-kube-api-access-hk8jn\") pod \"a22f47dc-59ce-4cce-821c-508fc14a9508\" (UID: \"a22f47dc-59ce-4cce-821c-508fc14a9508\") " Mar 20 07:17:19 crc kubenswrapper[4749]: I0320 07:17:19.939229 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a22f47dc-59ce-4cce-821c-508fc14a9508-catalog-content\") pod \"a22f47dc-59ce-4cce-821c-508fc14a9508\" (UID: \"a22f47dc-59ce-4cce-821c-508fc14a9508\") " Mar 20 07:17:19 crc kubenswrapper[4749]: I0320 07:17:19.939052 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a22f47dc-59ce-4cce-821c-508fc14a9508-utilities" (OuterVolumeSpecName: "utilities") pod "a22f47dc-59ce-4cce-821c-508fc14a9508" (UID: "a22f47dc-59ce-4cce-821c-508fc14a9508"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:17:19 crc kubenswrapper[4749]: I0320 07:17:19.945517 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a22f47dc-59ce-4cce-821c-508fc14a9508-kube-api-access-hk8jn" (OuterVolumeSpecName: "kube-api-access-hk8jn") pod "a22f47dc-59ce-4cce-821c-508fc14a9508" (UID: "a22f47dc-59ce-4cce-821c-508fc14a9508"). InnerVolumeSpecName "kube-api-access-hk8jn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:17:19 crc kubenswrapper[4749]: I0320 07:17:19.997615 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a22f47dc-59ce-4cce-821c-508fc14a9508-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a22f47dc-59ce-4cce-821c-508fc14a9508" (UID: "a22f47dc-59ce-4cce-821c-508fc14a9508"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:17:20 crc kubenswrapper[4749]: I0320 07:17:20.044175 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a22f47dc-59ce-4cce-821c-508fc14a9508-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 07:17:20 crc kubenswrapper[4749]: I0320 07:17:20.044208 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hk8jn\" (UniqueName: \"kubernetes.io/projected/a22f47dc-59ce-4cce-821c-508fc14a9508-kube-api-access-hk8jn\") on node \"crc\" DevicePath \"\"" Mar 20 07:17:20 crc kubenswrapper[4749]: I0320 07:17:20.044219 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a22f47dc-59ce-4cce-821c-508fc14a9508-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 07:17:20 crc kubenswrapper[4749]: I0320 07:17:20.193372 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6551fdd2-b4de-4478-aa69-c1c2bb35d1b7" path="/var/lib/kubelet/pods/6551fdd2-b4de-4478-aa69-c1c2bb35d1b7/volumes" Mar 20 07:17:20 crc kubenswrapper[4749]: I0320 07:17:20.194780 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2871df9-37c9-45aa-a6f9-af85d64f7615" path="/var/lib/kubelet/pods/c2871df9-37c9-45aa-a6f9-af85d64f7615/volumes" Mar 20 07:17:20 crc kubenswrapper[4749]: I0320 07:17:20.195929 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0" path="/var/lib/kubelet/pods/ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0/volumes" Mar 20 07:17:20 crc kubenswrapper[4749]: I0320 07:17:20.247698 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x9swj" event={"ID":"a22f47dc-59ce-4cce-821c-508fc14a9508","Type":"ContainerDied","Data":"1c432bbc90f2ad3798c358ce4651b7f7d73cfbc3b8eef9d4ead6ab769c504ced"} Mar 20 07:17:20 crc kubenswrapper[4749]: I0320 07:17:20.247734 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-x9swj" Mar 20 07:17:20 crc kubenswrapper[4749]: I0320 07:17:20.247772 4749 scope.go:117] "RemoveContainer" containerID="bbc52864abe0fe9cf380b57e4493cf00807794a3926cfa0edc36578936ac2d49" Mar 20 07:17:20 crc kubenswrapper[4749]: I0320 07:17:20.251914 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b98466987-56rc8" event={"ID":"86a83215-d581-4836-852f-721e6ea3db4b","Type":"ContainerStarted","Data":"416da31e2aee33a1da62f4492767576e390cb15bd3b2f391a2d0a73cf3689f14"} Mar 20 07:17:20 crc kubenswrapper[4749]: I0320 07:17:20.252349 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5b98466987-56rc8" Mar 20 07:17:20 crc kubenswrapper[4749]: I0320 07:17:20.254301 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-bb5477fd6-q8ksg" event={"ID":"d46e89a9-b781-454b-8e2d-870e92825114","Type":"ContainerStarted","Data":"de46bf5d8c78b17022e1b952edea175cc0584ed8eccc97d3514f470ddc796907"} Mar 20 07:17:20 crc kubenswrapper[4749]: I0320 07:17:20.254978 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-bb5477fd6-q8ksg" Mar 20 07:17:20 crc kubenswrapper[4749]: I0320 07:17:20.266171 4749 scope.go:117] "RemoveContainer" containerID="4b3e98598ca0753493c27401b8ab35e23d717b69b5e00e34418ea0ffae39cd7f" Mar 20 07:17:20 crc kubenswrapper[4749]: I0320 07:17:20.266685 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5b98466987-56rc8" Mar 20 07:17:20 crc kubenswrapper[4749]: I0320 07:17:20.267322 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-bb5477fd6-q8ksg" Mar 20 07:17:20 crc kubenswrapper[4749]: I0320 07:17:20.276154 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5b98466987-56rc8" podStartSLOduration=4.276126552 podStartE2EDuration="4.276126552s" podCreationTimestamp="2026-03-20 07:17:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:17:20.273631526 +0000 UTC m=+276.823289213" watchObservedRunningTime="2026-03-20 07:17:20.276126552 +0000 UTC m=+276.825784239" Mar 20 07:17:20 crc kubenswrapper[4749]: I0320 07:17:20.306644 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lq4rb" Mar 20 07:17:20 crc kubenswrapper[4749]: I0320 07:17:20.311807 4749 scope.go:117] "RemoveContainer" containerID="6c42fd191ad1d44d691538c5f9f060bd726611fc36cd413387807f41709e1035" Mar 20 07:17:20 crc kubenswrapper[4749]: I0320 07:17:20.335340 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x9swj"] Mar 20 07:17:20 crc kubenswrapper[4749]: I0320 07:17:20.339862 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-x9swj"] Mar 20 07:17:20 crc kubenswrapper[4749]: I0320 07:17:20.348302 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-bb5477fd6-q8ksg" podStartSLOduration=4.348269259 
podStartE2EDuration="4.348269259s" podCreationTimestamp="2026-03-20 07:17:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:17:20.34756111 +0000 UTC m=+276.897218747" watchObservedRunningTime="2026-03-20 07:17:20.348269259 +0000 UTC m=+276.897926906" Mar 20 07:17:21 crc kubenswrapper[4749]: I0320 07:17:21.064081 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-szw2w"] Mar 20 07:17:21 crc kubenswrapper[4749]: I0320 07:17:21.262375 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-szw2w" podUID="cd2462fa-d077-4466-8930-6f2e69938c1b" containerName="registry-server" containerID="cri-o://b5fa511364a1f03bcfbc6c9fca4997da92b1fc1a23448675bf6ab27b8d392799" gracePeriod=2 Mar 20 07:17:21 crc kubenswrapper[4749]: I0320 07:17:21.686317 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-szw2w" Mar 20 07:17:21 crc kubenswrapper[4749]: I0320 07:17:21.769838 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txjch\" (UniqueName: \"kubernetes.io/projected/cd2462fa-d077-4466-8930-6f2e69938c1b-kube-api-access-txjch\") pod \"cd2462fa-d077-4466-8930-6f2e69938c1b\" (UID: \"cd2462fa-d077-4466-8930-6f2e69938c1b\") " Mar 20 07:17:21 crc kubenswrapper[4749]: I0320 07:17:21.769935 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd2462fa-d077-4466-8930-6f2e69938c1b-utilities\") pod \"cd2462fa-d077-4466-8930-6f2e69938c1b\" (UID: \"cd2462fa-d077-4466-8930-6f2e69938c1b\") " Mar 20 07:17:21 crc kubenswrapper[4749]: I0320 07:17:21.770004 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd2462fa-d077-4466-8930-6f2e69938c1b-catalog-content\") pod \"cd2462fa-d077-4466-8930-6f2e69938c1b\" (UID: \"cd2462fa-d077-4466-8930-6f2e69938c1b\") " Mar 20 07:17:21 crc kubenswrapper[4749]: I0320 07:17:21.771110 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd2462fa-d077-4466-8930-6f2e69938c1b-utilities" (OuterVolumeSpecName: "utilities") pod "cd2462fa-d077-4466-8930-6f2e69938c1b" (UID: "cd2462fa-d077-4466-8930-6f2e69938c1b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:17:21 crc kubenswrapper[4749]: I0320 07:17:21.794668 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd2462fa-d077-4466-8930-6f2e69938c1b-kube-api-access-txjch" (OuterVolumeSpecName: "kube-api-access-txjch") pod "cd2462fa-d077-4466-8930-6f2e69938c1b" (UID: "cd2462fa-d077-4466-8930-6f2e69938c1b"). InnerVolumeSpecName "kube-api-access-txjch". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:17:21 crc kubenswrapper[4749]: I0320 07:17:21.833405 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd2462fa-d077-4466-8930-6f2e69938c1b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cd2462fa-d077-4466-8930-6f2e69938c1b" (UID: "cd2462fa-d077-4466-8930-6f2e69938c1b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:17:21 crc kubenswrapper[4749]: I0320 07:17:21.871022 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txjch\" (UniqueName: \"kubernetes.io/projected/cd2462fa-d077-4466-8930-6f2e69938c1b-kube-api-access-txjch\") on node \"crc\" DevicePath \"\"" Mar 20 07:17:21 crc kubenswrapper[4749]: I0320 07:17:21.871071 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd2462fa-d077-4466-8930-6f2e69938c1b-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 07:17:21 crc kubenswrapper[4749]: I0320 07:17:21.871085 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd2462fa-d077-4466-8930-6f2e69938c1b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 07:17:22 crc kubenswrapper[4749]: I0320 07:17:22.189363 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a22f47dc-59ce-4cce-821c-508fc14a9508" path="/var/lib/kubelet/pods/a22f47dc-59ce-4cce-821c-508fc14a9508/volumes" Mar 20 07:17:22 crc kubenswrapper[4749]: I0320 07:17:22.272960 4749 generic.go:334] "Generic (PLEG): container finished" podID="cd2462fa-d077-4466-8930-6f2e69938c1b" containerID="b5fa511364a1f03bcfbc6c9fca4997da92b1fc1a23448675bf6ab27b8d392799" exitCode=0 Mar 20 07:17:22 crc kubenswrapper[4749]: I0320 07:17:22.273014 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-szw2w" Mar 20 07:17:22 crc kubenswrapper[4749]: I0320 07:17:22.273057 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-szw2w" event={"ID":"cd2462fa-d077-4466-8930-6f2e69938c1b","Type":"ContainerDied","Data":"b5fa511364a1f03bcfbc6c9fca4997da92b1fc1a23448675bf6ab27b8d392799"} Mar 20 07:17:22 crc kubenswrapper[4749]: I0320 07:17:22.273128 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-szw2w" event={"ID":"cd2462fa-d077-4466-8930-6f2e69938c1b","Type":"ContainerDied","Data":"1bceb1f83fc2093b7e14b4c6ad527ca3b6aed59db619c4cd190ab0f3b9a635b2"} Mar 20 07:17:22 crc kubenswrapper[4749]: I0320 07:17:22.273192 4749 scope.go:117] "RemoveContainer" containerID="b5fa511364a1f03bcfbc6c9fca4997da92b1fc1a23448675bf6ab27b8d392799" Mar 20 07:17:22 crc kubenswrapper[4749]: I0320 07:17:22.295610 4749 scope.go:117] "RemoveContainer" containerID="f40a28d7711a2db4e38e71739316875514979b67697055bad76458e6d4d58f27" Mar 20 07:17:22 crc kubenswrapper[4749]: I0320 07:17:22.300641 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-szw2w"] Mar 20 07:17:22 crc kubenswrapper[4749]: I0320 07:17:22.309114 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-szw2w"] Mar 20 07:17:22 crc kubenswrapper[4749]: I0320 07:17:22.318949 4749 scope.go:117] "RemoveContainer" containerID="a1a682f4a82cd43727b79d814e9dfa97798f75d71d9d6a9c3f23f09fdceb5da8" Mar 20 07:17:22 crc kubenswrapper[4749]: I0320 07:17:22.352074 4749 scope.go:117] "RemoveContainer" containerID="b5fa511364a1f03bcfbc6c9fca4997da92b1fc1a23448675bf6ab27b8d392799" Mar 20 07:17:22 crc kubenswrapper[4749]: E0320 07:17:22.352829 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5fa511364a1f03bcfbc6c9fca4997da92b1fc1a23448675bf6ab27b8d392799\": container with ID 
starting with b5fa511364a1f03bcfbc6c9fca4997da92b1fc1a23448675bf6ab27b8d392799 not found: ID does not exist" containerID="b5fa511364a1f03bcfbc6c9fca4997da92b1fc1a23448675bf6ab27b8d392799" Mar 20 07:17:22 crc kubenswrapper[4749]: I0320 07:17:22.352905 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5fa511364a1f03bcfbc6c9fca4997da92b1fc1a23448675bf6ab27b8d392799"} err="failed to get container status \"b5fa511364a1f03bcfbc6c9fca4997da92b1fc1a23448675bf6ab27b8d392799\": rpc error: code = NotFound desc = could not find container \"b5fa511364a1f03bcfbc6c9fca4997da92b1fc1a23448675bf6ab27b8d392799\": container with ID starting with b5fa511364a1f03bcfbc6c9fca4997da92b1fc1a23448675bf6ab27b8d392799 not found: ID does not exist" Mar 20 07:17:22 crc kubenswrapper[4749]: I0320 07:17:22.352951 4749 scope.go:117] "RemoveContainer" containerID="f40a28d7711a2db4e38e71739316875514979b67697055bad76458e6d4d58f27" Mar 20 07:17:22 crc kubenswrapper[4749]: E0320 07:17:22.353807 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f40a28d7711a2db4e38e71739316875514979b67697055bad76458e6d4d58f27\": container with ID starting with f40a28d7711a2db4e38e71739316875514979b67697055bad76458e6d4d58f27 not found: ID does not exist" containerID="f40a28d7711a2db4e38e71739316875514979b67697055bad76458e6d4d58f27" Mar 20 07:17:22 crc kubenswrapper[4749]: I0320 07:17:22.353852 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f40a28d7711a2db4e38e71739316875514979b67697055bad76458e6d4d58f27"} err="failed to get container status \"f40a28d7711a2db4e38e71739316875514979b67697055bad76458e6d4d58f27\": rpc error: code = NotFound desc = could not find container \"f40a28d7711a2db4e38e71739316875514979b67697055bad76458e6d4d58f27\": container with ID starting with f40a28d7711a2db4e38e71739316875514979b67697055bad76458e6d4d58f27 not found: ID does not exist" Mar 20 07:17:22 crc kubenswrapper[4749]: I0320 07:17:22.353883 4749 scope.go:117] "RemoveContainer" containerID="a1a682f4a82cd43727b79d814e9dfa97798f75d71d9d6a9c3f23f09fdceb5da8" Mar 20 07:17:22 crc kubenswrapper[4749]: E0320 07:17:22.354489 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1a682f4a82cd43727b79d814e9dfa97798f75d71d9d6a9c3f23f09fdceb5da8\": container with ID starting with a1a682f4a82cd43727b79d814e9dfa97798f75d71d9d6a9c3f23f09fdceb5da8 not found: ID does not exist" containerID="a1a682f4a82cd43727b79d814e9dfa97798f75d71d9d6a9c3f23f09fdceb5da8" Mar 20 07:17:22 crc kubenswrapper[4749]: I0320 07:17:22.354548 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1a682f4a82cd43727b79d814e9dfa97798f75d71d9d6a9c3f23f09fdceb5da8"} err="failed to get container status \"a1a682f4a82cd43727b79d814e9dfa97798f75d71d9d6a9c3f23f09fdceb5da8\": rpc error: code = NotFound desc = could not find container \"a1a682f4a82cd43727b79d814e9dfa97798f75d71d9d6a9c3f23f09fdceb5da8\": container with ID starting with a1a682f4a82cd43727b79d814e9dfa97798f75d71d9d6a9c3f23f09fdceb5da8 not found: ID does not exist" Mar 20 07:17:23 crc kubenswrapper[4749]: I0320 07:17:23.461866 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lq4rb"] Mar 20 07:17:23 crc kubenswrapper[4749]: I0320 07:17:23.462621 4749 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-marketplace/redhat-operators-lq4rb" podUID="937dac41-5afa-495a-9909-1152a419549c" containerName="registry-server" containerID="cri-o://3ccff66f0c4353028f2babf1b3a14d4ada8c01486b7e7108c6ef6f214490ebfd" gracePeriod=2 Mar 20 07:17:24 crc kubenswrapper[4749]: I0320 07:17:24.190748 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd2462fa-d077-4466-8930-6f2e69938c1b" path="/var/lib/kubelet/pods/cd2462fa-d077-4466-8930-6f2e69938c1b/volumes" Mar 20 07:17:24 crc kubenswrapper[4749]: I0320 07:17:24.292566 4749 generic.go:334] "Generic (PLEG): container finished" podID="937dac41-5afa-495a-9909-1152a419549c" containerID="3ccff66f0c4353028f2babf1b3a14d4ada8c01486b7e7108c6ef6f214490ebfd" exitCode=0 Mar 20 07:17:24 crc kubenswrapper[4749]: I0320 07:17:24.292644 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lq4rb" event={"ID":"937dac41-5afa-495a-9909-1152a419549c","Type":"ContainerDied","Data":"3ccff66f0c4353028f2babf1b3a14d4ada8c01486b7e7108c6ef6f214490ebfd"} Mar 20 07:17:24 crc kubenswrapper[4749]: I0320 07:17:24.593871 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lq4rb" Mar 20 07:17:24 crc kubenswrapper[4749]: I0320 07:17:24.610921 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlzhl\" (UniqueName: \"kubernetes.io/projected/937dac41-5afa-495a-9909-1152a419549c-kube-api-access-rlzhl\") pod \"937dac41-5afa-495a-9909-1152a419549c\" (UID: \"937dac41-5afa-495a-9909-1152a419549c\") " Mar 20 07:17:24 crc kubenswrapper[4749]: I0320 07:17:24.611301 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/937dac41-5afa-495a-9909-1152a419549c-utilities\") pod \"937dac41-5afa-495a-9909-1152a419549c\" (UID: \"937dac41-5afa-495a-9909-1152a419549c\") " Mar 20 07:17:24 crc kubenswrapper[4749]: I0320 07:17:24.611419 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/937dac41-5afa-495a-9909-1152a419549c-catalog-content\") pod \"937dac41-5afa-495a-9909-1152a419549c\" (UID: \"937dac41-5afa-495a-9909-1152a419549c\") " Mar 20 07:17:24 crc kubenswrapper[4749]: I0320 07:17:24.615615 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/937dac41-5afa-495a-9909-1152a419549c-utilities" (OuterVolumeSpecName: "utilities") pod "937dac41-5afa-495a-9909-1152a419549c" (UID: "937dac41-5afa-495a-9909-1152a419549c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:17:24 crc kubenswrapper[4749]: I0320 07:17:24.619491 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/937dac41-5afa-495a-9909-1152a419549c-kube-api-access-rlzhl" (OuterVolumeSpecName: "kube-api-access-rlzhl") pod "937dac41-5afa-495a-9909-1152a419549c" (UID: "937dac41-5afa-495a-9909-1152a419549c"). InnerVolumeSpecName "kube-api-access-rlzhl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:17:24 crc kubenswrapper[4749]: I0320 07:17:24.713117 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlzhl\" (UniqueName: \"kubernetes.io/projected/937dac41-5afa-495a-9909-1152a419549c-kube-api-access-rlzhl\") on node \"crc\" DevicePath \"\"" Mar 20 07:17:24 crc kubenswrapper[4749]: I0320 07:17:24.713145 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/937dac41-5afa-495a-9909-1152a419549c-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 07:17:24 crc kubenswrapper[4749]: I0320 07:17:24.748965 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/937dac41-5afa-495a-9909-1152a419549c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "937dac41-5afa-495a-9909-1152a419549c" (UID: "937dac41-5afa-495a-9909-1152a419549c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:17:24 crc kubenswrapper[4749]: I0320 07:17:24.814358 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/937dac41-5afa-495a-9909-1152a419549c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 07:17:25 crc kubenswrapper[4749]: I0320 07:17:25.304361 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lq4rb" event={"ID":"937dac41-5afa-495a-9909-1152a419549c","Type":"ContainerDied","Data":"808123ecdcdbc205854608f4e17e2661ef6666efba995198c0ef116a48cfaf1f"} Mar 20 07:17:25 crc kubenswrapper[4749]: I0320 07:17:25.304444 4749 scope.go:117] "RemoveContainer" containerID="3ccff66f0c4353028f2babf1b3a14d4ada8c01486b7e7108c6ef6f214490ebfd" Mar 20 07:17:25 crc kubenswrapper[4749]: I0320 07:17:25.304647 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lq4rb" Mar 20 07:17:25 crc kubenswrapper[4749]: I0320 07:17:25.327875 4749 scope.go:117] "RemoveContainer" containerID="d92f9d3e4b5049a11264ba7179af1895262839ee476ac245385ea2a25dbab7f2" Mar 20 07:17:25 crc kubenswrapper[4749]: I0320 07:17:25.357624 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lq4rb"] Mar 20 07:17:25 crc kubenswrapper[4749]: I0320 07:17:25.363234 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lq4rb"] Mar 20 07:17:25 crc kubenswrapper[4749]: I0320 07:17:25.369036 4749 scope.go:117] "RemoveContainer" containerID="923ff68fbc0eeabcf3046d61cc602be29815801953a9f68cb5bd2df14a94ab7f" Mar 20 07:17:26 crc kubenswrapper[4749]: I0320 07:17:26.187011 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="937dac41-5afa-495a-9909-1152a419549c" path="/var/lib/kubelet/pods/937dac41-5afa-495a-9909-1152a419549c/volumes" Mar 20 07:17:36 crc kubenswrapper[4749]: I0320 07:17:36.642459 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5b98466987-56rc8"] Mar 20 07:17:36 crc kubenswrapper[4749]: I0320 07:17:36.643426 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5b98466987-56rc8" podUID="86a83215-d581-4836-852f-721e6ea3db4b" containerName="controller-manager" containerID="cri-o://416da31e2aee33a1da62f4492767576e390cb15bd3b2f391a2d0a73cf3689f14" gracePeriod=30 Mar 20 07:17:36 crc kubenswrapper[4749]: I0320 07:17:36.733999 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bb5477fd6-q8ksg"] Mar 20 07:17:36 crc kubenswrapper[4749]: I0320 07:17:36.734243 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-bb5477fd6-q8ksg" podUID="d46e89a9-b781-454b-8e2d-870e92825114" containerName="route-controller-manager" containerID="cri-o://de46bf5d8c78b17022e1b952edea175cc0584ed8eccc97d3514f470ddc796907" gracePeriod=30 Mar 20 07:17:36 crc kubenswrapper[4749]: I0320 07:17:36.958974 4749 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 07:17:36 crc kubenswrapper[4749]: E0320 07:17:36.959448 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd2462fa-d077-4466-8930-6f2e69938c1b" containerName="extract-content" Mar 20 07:17:36 crc kubenswrapper[4749]: I0320 07:17:36.959467 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd2462fa-d077-4466-8930-6f2e69938c1b" containerName="extract-content" Mar 20 07:17:36 crc kubenswrapper[4749]: E0320 07:17:36.959479 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0" containerName="extract-content" Mar 20 07:17:36 crc kubenswrapper[4749]: I0320 07:17:36.959485 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0" containerName="extract-content" Mar 20 07:17:36 crc kubenswrapper[4749]: E0320 07:17:36.959498 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd2462fa-d077-4466-8930-6f2e69938c1b" containerName="extract-utilities" Mar 20 07:17:36 crc kubenswrapper[4749]: I0320 07:17:36.959505 4749 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="cd2462fa-d077-4466-8930-6f2e69938c1b" containerName="extract-utilities" Mar 20 07:17:36 crc kubenswrapper[4749]: E0320 07:17:36.959511 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a22f47dc-59ce-4cce-821c-508fc14a9508" containerName="extract-utilities" Mar 20 07:17:36 crc kubenswrapper[4749]: I0320 07:17:36.959517 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a22f47dc-59ce-4cce-821c-508fc14a9508" containerName="extract-utilities" Mar 20 07:17:36 crc kubenswrapper[4749]: E0320 07:17:36.959526 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="937dac41-5afa-495a-9909-1152a419549c" containerName="extract-utilities" Mar 20 07:17:36 crc kubenswrapper[4749]: I0320 07:17:36.959531 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="937dac41-5afa-495a-9909-1152a419549c" containerName="extract-utilities" Mar 20 07:17:36 crc kubenswrapper[4749]: E0320 07:17:36.959540 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="937dac41-5afa-495a-9909-1152a419549c" containerName="extract-content" Mar 20 07:17:36 crc kubenswrapper[4749]: I0320 07:17:36.959547 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="937dac41-5afa-495a-9909-1152a419549c" containerName="extract-content" Mar 20 07:17:36 crc kubenswrapper[4749]: E0320 07:17:36.959556 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0" containerName="registry-server" Mar 20 07:17:36 crc kubenswrapper[4749]: I0320 07:17:36.959561 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0" containerName="registry-server" Mar 20 07:17:36 crc kubenswrapper[4749]: E0320 07:17:36.959571 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a22f47dc-59ce-4cce-821c-508fc14a9508" containerName="registry-server" Mar 20 07:17:36 crc kubenswrapper[4749]: I0320 07:17:36.959577 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a22f47dc-59ce-4cce-821c-508fc14a9508" containerName="registry-server" Mar 20 07:17:36 crc kubenswrapper[4749]: E0320 07:17:36.959586 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd2462fa-d077-4466-8930-6f2e69938c1b" containerName="registry-server" Mar 20 07:17:36 crc kubenswrapper[4749]: I0320 07:17:36.959591 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd2462fa-d077-4466-8930-6f2e69938c1b" containerName="registry-server" Mar 20 07:17:36 crc kubenswrapper[4749]: E0320 07:17:36.959599 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0" containerName="extract-utilities" Mar 20 07:17:36 crc kubenswrapper[4749]: I0320 07:17:36.959604 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0" containerName="extract-utilities" Mar 20 07:17:36 crc kubenswrapper[4749]: E0320 07:17:36.959615 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a22f47dc-59ce-4cce-821c-508fc14a9508" containerName="extract-content" Mar 20 07:17:36 crc kubenswrapper[4749]: I0320 07:17:36.959621 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a22f47dc-59ce-4cce-821c-508fc14a9508" containerName="extract-content" Mar 20 07:17:36 crc kubenswrapper[4749]: E0320 07:17:36.959630 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="937dac41-5afa-495a-9909-1152a419549c" containerName="registry-server" Mar 20 07:17:36 crc kubenswrapper[4749]: I0320 07:17:36.959636 4749 
state_mem.go:107] "Deleted CPUSet assignment" podUID="937dac41-5afa-495a-9909-1152a419549c" containerName="registry-server" Mar 20 07:17:36 crc kubenswrapper[4749]: I0320 07:17:36.959751 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff5144b0-e9ee-4e0a-bd55-51d8a1cd7ad0" containerName="registry-server" Mar 20 07:17:36 crc kubenswrapper[4749]: I0320 07:17:36.959761 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a22f47dc-59ce-4cce-821c-508fc14a9508" containerName="registry-server" Mar 20 07:17:36 crc kubenswrapper[4749]: I0320 07:17:36.959770 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="937dac41-5afa-495a-9909-1152a419549c" containerName="registry-server" Mar 20 07:17:36 crc kubenswrapper[4749]: I0320 07:17:36.959781 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd2462fa-d077-4466-8930-6f2e69938c1b" containerName="registry-server" Mar 20 07:17:36 crc kubenswrapper[4749]: I0320 07:17:36.960114 4749 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 07:17:36 crc kubenswrapper[4749]: I0320 07:17:36.960252 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 07:17:36 crc kubenswrapper[4749]: I0320 07:17:36.960426 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://b7a9d3d56425dd88c89608d446f6d44c5f90644cea243dd023e74c5630a0a99e" gracePeriod=15 Mar 20 07:17:36 crc kubenswrapper[4749]: I0320 07:17:36.960482 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://21e71bf5e132166e8d3e2f33eb325502e54ff36380220a07917135b27ebe41c6" gracePeriod=15 Mar 20 07:17:36 crc kubenswrapper[4749]: I0320 07:17:36.960447 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://688e8fa067ea553fac09be724c46f16706c8b3463f09d6a4e2cfe3212027da17" gracePeriod=15 Mar 20 07:17:36 crc kubenswrapper[4749]: I0320 07:17:36.960556 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://5b332a4612c6855c57c6c15a305a1f56099dab01f849027ea2eeda56718010cc" gracePeriod=15 Mar 20 07:17:36 crc kubenswrapper[4749]: I0320 07:17:36.960604 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://12cb6e64ecd020e07bd8f22e52fcf960c975a09da0f06a9f43daf5bfbff01de3" gracePeriod=15 Mar 20 07:17:36 crc kubenswrapper[4749]: I0320 07:17:36.961917 4749 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 07:17:36 crc kubenswrapper[4749]: E0320 07:17:36.962141 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-cert-regeneration-controller" Mar 20 07:17:36 crc kubenswrapper[4749]: I0320 07:17:36.962154 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 07:17:36 crc kubenswrapper[4749]: E0320 07:17:36.962165 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 07:17:36 crc kubenswrapper[4749]: I0320 07:17:36.962172 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 07:17:36 crc kubenswrapper[4749]: E0320 07:17:36.962179 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 07:17:36 crc kubenswrapper[4749]: I0320 07:17:36.962186 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 07:17:36 crc kubenswrapper[4749]: E0320 07:17:36.962193 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 20 07:17:36 crc kubenswrapper[4749]: I0320 07:17:36.962199 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 20 07:17:36 crc kubenswrapper[4749]: E0320 07:17:36.962208 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 07:17:36 crc kubenswrapper[4749]: I0320 07:17:36.962217 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 07:17:36 crc kubenswrapper[4749]: E0320 07:17:36.962237 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 07:17:36 crc kubenswrapper[4749]: I0320 07:17:36.962245 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 07:17:36 crc kubenswrapper[4749]: E0320 07:17:36.962255 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 07:17:36 crc kubenswrapper[4749]: I0320 07:17:36.962261 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 07:17:36 crc kubenswrapper[4749]: E0320 07:17:36.962273 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 07:17:36 crc kubenswrapper[4749]: I0320 07:17:36.962296 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 07:17:36 crc kubenswrapper[4749]: I0320 07:17:36.962380 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 07:17:36 crc kubenswrapper[4749]: I0320 07:17:36.962387 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 
07:17:36 crc kubenswrapper[4749]: I0320 07:17:36.962397 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 07:17:36 crc kubenswrapper[4749]: I0320 07:17:36.962406 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 07:17:36 crc kubenswrapper[4749]: I0320 07:17:36.962412 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 07:17:36 crc kubenswrapper[4749]: I0320 07:17:36.962420 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 07:17:36 crc kubenswrapper[4749]: I0320 07:17:36.962427 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 07:17:36 crc kubenswrapper[4749]: I0320 07:17:36.962434 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 07:17:36 crc kubenswrapper[4749]: E0320 07:17:36.962520 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 07:17:36 crc kubenswrapper[4749]: I0320 07:17:36.962529 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 07:17:36 crc kubenswrapper[4749]: E0320 07:17:36.962538 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 07:17:36 crc kubenswrapper[4749]: I0320 07:17:36.962545 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 07:17:36 crc kubenswrapper[4749]: I0320 07:17:36.962667 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.005753 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.089042 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.089521 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.089620 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
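Before admitting the newly added static pods, the cpu_manager and memory_manager purge the per-container resource assignments left behind by pods that no longer exist (the marketplace pods and the old kube-apiserver containers above); each paired cpu_manager/state_mem line is one purge plus the checkpoint update. A toy Go sketch of that bookkeeping, using a plain map keyed by podUID and container name rather than the kubelet's real state types:

    package main

    import "fmt"

    // staleStateCache is a toy stand-in for the CPU manager's checkpointed
    // per-container assignments, keyed by podUID then container name.
    type staleStateCache struct {
        assignments map[string]map[string]string // podUID -> container -> cpuset
    }

    // RemoveStaleState drops every assignment whose pod is no longer active,
    // mirroring the "RemoveStaleState: removing container" / "Deleted CPUSet
    // assignment" pairs in the log.
    func (c *staleStateCache) RemoveStaleState(active map[string]bool) {
        for podUID, containers := range c.assignments {
            if active[podUID] {
                continue
            }
            for name := range containers {
                fmt.Printf("removing stale assignment pod=%s container=%s\n", podUID, name)
                delete(containers, name)
            }
            delete(c.assignments, podUID)
        }
    }

    func main() {
        c := &staleStateCache{assignments: map[string]map[string]string{
            "937dac41": {"registry-server": "0-3"},
        }}
        c.RemoveStaleState(map[string]bool{}) // no active pods: everything is stale
    }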
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.089656 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.089691 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.089784 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.089852 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.089919 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.191094 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.191161 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.191197 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.191255 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.191290 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.191429 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.191469 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.191558 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.191620 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.191728 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.191732 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.191788 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.191800 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.191878 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.191893 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.191949 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.239681 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b98466987-56rc8" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.240645 4749 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.241080 4749 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.241480 4749 status_manager.go:851] "Failed to get status for pod" podUID="86a83215-d581-4836-852f-721e6ea3db4b" pod="openshift-controller-manager/controller-manager-5b98466987-56rc8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5b98466987-56rc8\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.244500 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-bb5477fd6-q8ksg" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.244900 4749 status_manager.go:851] "Failed to get status for pod" podUID="86a83215-d581-4836-852f-721e6ea3db4b" pod="openshift-controller-manager/controller-manager-5b98466987-56rc8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5b98466987-56rc8\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.245133 4749 status_manager.go:851] "Failed to get status for pod" podUID="d46e89a9-b781-454b-8e2d-870e92825114" pod="openshift-route-controller-manager/route-controller-manager-bb5477fd6-q8ksg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-bb5477fd6-q8ksg\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.245418 4749 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.246332 4749 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.292801 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/86a83215-d581-4836-852f-721e6ea3db4b-proxy-ca-bundles\") pod \"86a83215-d581-4836-852f-721e6ea3db4b\" (UID: \"86a83215-d581-4836-852f-721e6ea3db4b\") " Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.292869 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86a83215-d581-4836-852f-721e6ea3db4b-config\") pod \"86a83215-d581-4836-852f-721e6ea3db4b\" (UID: \"86a83215-d581-4836-852f-721e6ea3db4b\") " Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.292893 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d46e89a9-b781-454b-8e2d-870e92825114-client-ca\") pod \"d46e89a9-b781-454b-8e2d-870e92825114\" (UID: \"d46e89a9-b781-454b-8e2d-870e92825114\") " Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.292919 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lsdv\" (UniqueName: \"kubernetes.io/projected/d46e89a9-b781-454b-8e2d-870e92825114-kube-api-access-9lsdv\") pod \"d46e89a9-b781-454b-8e2d-870e92825114\" (UID: \"d46e89a9-b781-454b-8e2d-870e92825114\") " Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.292946 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gx82b\" (UniqueName: \"kubernetes.io/projected/86a83215-d581-4836-852f-721e6ea3db4b-kube-api-access-gx82b\") pod \"86a83215-d581-4836-852f-721e6ea3db4b\" (UID: 
\"86a83215-d581-4836-852f-721e6ea3db4b\") " Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.292962 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d46e89a9-b781-454b-8e2d-870e92825114-config\") pod \"d46e89a9-b781-454b-8e2d-870e92825114\" (UID: \"d46e89a9-b781-454b-8e2d-870e92825114\") " Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.292976 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86a83215-d581-4836-852f-721e6ea3db4b-serving-cert\") pod \"86a83215-d581-4836-852f-721e6ea3db4b\" (UID: \"86a83215-d581-4836-852f-721e6ea3db4b\") " Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.292996 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d46e89a9-b781-454b-8e2d-870e92825114-serving-cert\") pod \"d46e89a9-b781-454b-8e2d-870e92825114\" (UID: \"d46e89a9-b781-454b-8e2d-870e92825114\") " Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.293022 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/86a83215-d581-4836-852f-721e6ea3db4b-client-ca\") pod \"86a83215-d581-4836-852f-721e6ea3db4b\" (UID: \"86a83215-d581-4836-852f-721e6ea3db4b\") " Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.293680 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86a83215-d581-4836-852f-721e6ea3db4b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "86a83215-d581-4836-852f-721e6ea3db4b" (UID: "86a83215-d581-4836-852f-721e6ea3db4b"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.293689 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d46e89a9-b781-454b-8e2d-870e92825114-client-ca" (OuterVolumeSpecName: "client-ca") pod "d46e89a9-b781-454b-8e2d-870e92825114" (UID: "d46e89a9-b781-454b-8e2d-870e92825114"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.293908 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86a83215-d581-4836-852f-721e6ea3db4b-client-ca" (OuterVolumeSpecName: "client-ca") pod "86a83215-d581-4836-852f-721e6ea3db4b" (UID: "86a83215-d581-4836-852f-721e6ea3db4b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.293936 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86a83215-d581-4836-852f-721e6ea3db4b-config" (OuterVolumeSpecName: "config") pod "86a83215-d581-4836-852f-721e6ea3db4b" (UID: "86a83215-d581-4836-852f-721e6ea3db4b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.294392 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d46e89a9-b781-454b-8e2d-870e92825114-config" (OuterVolumeSpecName: "config") pod "d46e89a9-b781-454b-8e2d-870e92825114" (UID: "d46e89a9-b781-454b-8e2d-870e92825114"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.298057 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86a83215-d581-4836-852f-721e6ea3db4b-kube-api-access-gx82b" (OuterVolumeSpecName: "kube-api-access-gx82b") pod "86a83215-d581-4836-852f-721e6ea3db4b" (UID: "86a83215-d581-4836-852f-721e6ea3db4b"). InnerVolumeSpecName "kube-api-access-gx82b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.298298 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d46e89a9-b781-454b-8e2d-870e92825114-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d46e89a9-b781-454b-8e2d-870e92825114" (UID: "d46e89a9-b781-454b-8e2d-870e92825114"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.298303 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86a83215-d581-4836-852f-721e6ea3db4b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "86a83215-d581-4836-852f-721e6ea3db4b" (UID: "86a83215-d581-4836-852f-721e6ea3db4b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.298419 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d46e89a9-b781-454b-8e2d-870e92825114-kube-api-access-9lsdv" (OuterVolumeSpecName: "kube-api-access-9lsdv") pod "d46e89a9-b781-454b-8e2d-870e92825114" (UID: "d46e89a9-b781-454b-8e2d-870e92825114"). InnerVolumeSpecName "kube-api-access-9lsdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.300657 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 07:17:37 crc kubenswrapper[4749]: W0320 07:17:37.322983 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-da9be3cf5eb6b6be6b05a68dd94d52285f1cd6b97b304ac5e9bbddf64dd4d52d WatchSource:0}: Error finding container da9be3cf5eb6b6be6b05a68dd94d52285f1cd6b97b304ac5e9bbddf64dd4d52d: Status 404 returned error can't find the container with id da9be3cf5eb6b6be6b05a68dd94d52285f1cd6b97b304ac5e9bbddf64dd4d52d Mar 20 07:17:37 crc kubenswrapper[4749]: E0320 07:17:37.326389 4749 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.50:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e7b77b6952e5c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:17:37.325108828 +0000 UTC m=+293.874766475,LastTimestamp:2026-03-20 07:17:37.325108828 +0000 UTC m=+293.874766475,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.383406 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.384655 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.387424 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="688e8fa067ea553fac09be724c46f16706c8b3463f09d6a4e2cfe3212027da17" exitCode=0 Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.387463 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="12cb6e64ecd020e07bd8f22e52fcf960c975a09da0f06a9f43daf5bfbff01de3" exitCode=0 Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.387473 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="21e71bf5e132166e8d3e2f33eb325502e54ff36380220a07917135b27ebe41c6" exitCode=0 Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.387480 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5b332a4612c6855c57c6c15a305a1f56099dab01f849027ea2eeda56718010cc" exitCode=2 Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.387546 4749 scope.go:117] "RemoveContainer" containerID="f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 
Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.383406 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.384655 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.387424 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="688e8fa067ea553fac09be724c46f16706c8b3463f09d6a4e2cfe3212027da17" exitCode=0
Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.387463 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="12cb6e64ecd020e07bd8f22e52fcf960c975a09da0f06a9f43daf5bfbff01de3" exitCode=0
Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.387473 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="21e71bf5e132166e8d3e2f33eb325502e54ff36380220a07917135b27ebe41c6" exitCode=0
Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.387480 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5b332a4612c6855c57c6c15a305a1f56099dab01f849027ea2eeda56718010cc" exitCode=2
Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.387546 4749 scope.go:117] "RemoveContainer" containerID="f45f2b645bf6981e91289b944f37a3563f62584afe748b5423bfa81c3c71a2f8"
Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.389699 4749 generic.go:334] "Generic (PLEG): container finished" podID="30422da4-1696-4beb-be35-4216a61c897c" containerID="9a2ec5fd34ad5eac8d462ddc0672990a005d66e074fa6733c2b67116503f0fb6" exitCode=0
Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.389769 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"30422da4-1696-4beb-be35-4216a61c897c","Type":"ContainerDied","Data":"9a2ec5fd34ad5eac8d462ddc0672990a005d66e074fa6733c2b67116503f0fb6"}
Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.390843 4749 status_manager.go:851] "Failed to get status for pod" podUID="30422da4-1696-4beb-be35-4216a61c897c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused"
Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.390900 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"da9be3cf5eb6b6be6b05a68dd94d52285f1cd6b97b304ac5e9bbddf64dd4d52d"}
Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.391360 4749 status_manager.go:851] "Failed to get status for pod" podUID="86a83215-d581-4836-852f-721e6ea3db4b" pod="openshift-controller-manager/controller-manager-5b98466987-56rc8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5b98466987-56rc8\": dial tcp 38.102.83.50:6443: connect: connection refused"
Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.391816 4749 status_manager.go:851] "Failed to get status for pod" podUID="d46e89a9-b781-454b-8e2d-870e92825114" pod="openshift-route-controller-manager/route-controller-manager-bb5477fd6-q8ksg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-bb5477fd6-q8ksg\": dial tcp 38.102.83.50:6443: connect: connection refused"
Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.392065 4749 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.50:6443: connect: connection refused"
Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.392359 4749 generic.go:334] "Generic (PLEG): container finished" podID="86a83215-d581-4836-852f-721e6ea3db4b" containerID="416da31e2aee33a1da62f4492767576e390cb15bd3b2f391a2d0a73cf3689f14" exitCode=0
Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.392421 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b98466987-56rc8" event={"ID":"86a83215-d581-4836-852f-721e6ea3db4b","Type":"ContainerDied","Data":"416da31e2aee33a1da62f4492767576e390cb15bd3b2f391a2d0a73cf3689f14"}
Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.392443 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b98466987-56rc8"
Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.392453 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b98466987-56rc8" event={"ID":"86a83215-d581-4836-852f-721e6ea3db4b","Type":"ContainerDied","Data":"b1fe5fa978d12000e7b0e8e395283f666351c9f498a0799e64fb46b4bbf8f50e"}
Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.392543 4749 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.50:6443: connect: connection refused"
Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.393316 4749 status_manager.go:851] "Failed to get status for pod" podUID="d46e89a9-b781-454b-8e2d-870e92825114" pod="openshift-route-controller-manager/route-controller-manager-bb5477fd6-q8ksg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-bb5477fd6-q8ksg\": dial tcp 38.102.83.50:6443: connect: connection refused"
Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.394002 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/86a83215-d581-4836-852f-721e6ea3db4b-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.394022 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/86a83215-d581-4836-852f-721e6ea3db4b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.394035 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86a83215-d581-4836-852f-721e6ea3db4b-config\") on node \"crc\" DevicePath \"\""
Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.394043 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d46e89a9-b781-454b-8e2d-870e92825114-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.394053 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lsdv\" (UniqueName: \"kubernetes.io/projected/d46e89a9-b781-454b-8e2d-870e92825114-kube-api-access-9lsdv\") on node \"crc\" DevicePath \"\""
Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.394062 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gx82b\" (UniqueName: \"kubernetes.io/projected/86a83215-d581-4836-852f-721e6ea3db4b-kube-api-access-gx82b\") on node \"crc\" DevicePath \"\""
Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.394070 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d46e89a9-b781-454b-8e2d-870e92825114-config\") on node \"crc\" DevicePath \"\""
Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.394078 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86a83215-d581-4836-852f-721e6ea3db4b-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.394085 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d46e89a9-b781-454b-8e2d-870e92825114-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.394315 4749 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.50:6443: connect: connection refused"
Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.394750 4749 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.50:6443: connect: connection refused"
Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.395042 4749 status_manager.go:851] "Failed to get status for pod" podUID="86a83215-d581-4836-852f-721e6ea3db4b" pod="openshift-controller-manager/controller-manager-5b98466987-56rc8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5b98466987-56rc8\": dial tcp 38.102.83.50:6443: connect: connection refused"
Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.395333 4749 status_manager.go:851] "Failed to get status for pod" podUID="30422da4-1696-4beb-be35-4216a61c897c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused"
Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.396474 4749 generic.go:334] "Generic (PLEG): container finished" podID="d46e89a9-b781-454b-8e2d-870e92825114" containerID="de46bf5d8c78b17022e1b952edea175cc0584ed8eccc97d3514f470ddc796907" exitCode=0
Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.396505 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-bb5477fd6-q8ksg" event={"ID":"d46e89a9-b781-454b-8e2d-870e92825114","Type":"ContainerDied","Data":"de46bf5d8c78b17022e1b952edea175cc0584ed8eccc97d3514f470ddc796907"}
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-bb5477fd6-q8ksg" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.396529 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-bb5477fd6-q8ksg" event={"ID":"d46e89a9-b781-454b-8e2d-870e92825114","Type":"ContainerDied","Data":"5b9e5c6248f7dfba8a14f26d35d2390b728cb4ea98c5b94b8a0b99dfd0e4aadb"} Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.397047 4749 status_manager.go:851] "Failed to get status for pod" podUID="30422da4-1696-4beb-be35-4216a61c897c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.397299 4749 status_manager.go:851] "Failed to get status for pod" podUID="86a83215-d581-4836-852f-721e6ea3db4b" pod="openshift-controller-manager/controller-manager-5b98466987-56rc8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5b98466987-56rc8\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.397445 4749 status_manager.go:851] "Failed to get status for pod" podUID="d46e89a9-b781-454b-8e2d-870e92825114" pod="openshift-route-controller-manager/route-controller-manager-bb5477fd6-q8ksg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-bb5477fd6-q8ksg\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.397646 4749 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.400770 4749 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.409790 4749 status_manager.go:851] "Failed to get status for pod" podUID="30422da4-1696-4beb-be35-4216a61c897c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.410238 4749 status_manager.go:851] "Failed to get status for pod" podUID="86a83215-d581-4836-852f-721e6ea3db4b" pod="openshift-controller-manager/controller-manager-5b98466987-56rc8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5b98466987-56rc8\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.410541 4749 status_manager.go:851] "Failed to get status for pod" podUID="d46e89a9-b781-454b-8e2d-870e92825114" 
pod="openshift-route-controller-manager/route-controller-manager-bb5477fd6-q8ksg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-bb5477fd6-q8ksg\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.410774 4749 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.411048 4749 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.412434 4749 status_manager.go:851] "Failed to get status for pod" podUID="d46e89a9-b781-454b-8e2d-870e92825114" pod="openshift-route-controller-manager/route-controller-manager-bb5477fd6-q8ksg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-bb5477fd6-q8ksg\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.412604 4749 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.412804 4749 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.413162 4749 status_manager.go:851] "Failed to get status for pod" podUID="30422da4-1696-4beb-be35-4216a61c897c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.413604 4749 status_manager.go:851] "Failed to get status for pod" podUID="86a83215-d581-4836-852f-721e6ea3db4b" pod="openshift-controller-manager/controller-manager-5b98466987-56rc8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5b98466987-56rc8\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.422545 4749 scope.go:117] "RemoveContainer" containerID="416da31e2aee33a1da62f4492767576e390cb15bd3b2f391a2d0a73cf3689f14" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.436172 4749 scope.go:117] "RemoveContainer" containerID="416da31e2aee33a1da62f4492767576e390cb15bd3b2f391a2d0a73cf3689f14" Mar 20 07:17:37 crc kubenswrapper[4749]: 
E0320 07:17:37.436649 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"416da31e2aee33a1da62f4492767576e390cb15bd3b2f391a2d0a73cf3689f14\": container with ID starting with 416da31e2aee33a1da62f4492767576e390cb15bd3b2f391a2d0a73cf3689f14 not found: ID does not exist" containerID="416da31e2aee33a1da62f4492767576e390cb15bd3b2f391a2d0a73cf3689f14" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.436704 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"416da31e2aee33a1da62f4492767576e390cb15bd3b2f391a2d0a73cf3689f14"} err="failed to get container status \"416da31e2aee33a1da62f4492767576e390cb15bd3b2f391a2d0a73cf3689f14\": rpc error: code = NotFound desc = could not find container \"416da31e2aee33a1da62f4492767576e390cb15bd3b2f391a2d0a73cf3689f14\": container with ID starting with 416da31e2aee33a1da62f4492767576e390cb15bd3b2f391a2d0a73cf3689f14 not found: ID does not exist" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.436739 4749 scope.go:117] "RemoveContainer" containerID="de46bf5d8c78b17022e1b952edea175cc0584ed8eccc97d3514f470ddc796907" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.456409 4749 scope.go:117] "RemoveContainer" containerID="de46bf5d8c78b17022e1b952edea175cc0584ed8eccc97d3514f470ddc796907" Mar 20 07:17:37 crc kubenswrapper[4749]: E0320 07:17:37.457711 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de46bf5d8c78b17022e1b952edea175cc0584ed8eccc97d3514f470ddc796907\": container with ID starting with de46bf5d8c78b17022e1b952edea175cc0584ed8eccc97d3514f470ddc796907 not found: ID does not exist" containerID="de46bf5d8c78b17022e1b952edea175cc0584ed8eccc97d3514f470ddc796907" Mar 20 07:17:37 crc kubenswrapper[4749]: I0320 07:17:37.457784 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de46bf5d8c78b17022e1b952edea175cc0584ed8eccc97d3514f470ddc796907"} err="failed to get container status \"de46bf5d8c78b17022e1b952edea175cc0584ed8eccc97d3514f470ddc796907\": rpc error: code = NotFound desc = could not find container \"de46bf5d8c78b17022e1b952edea175cc0584ed8eccc97d3514f470ddc796907\": container with ID starting with de46bf5d8c78b17022e1b952edea175cc0584ed8eccc97d3514f470ddc796907 not found: ID does not exist" Mar 20 07:17:38 crc kubenswrapper[4749]: E0320 07:17:38.106964 4749 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:38 crc kubenswrapper[4749]: E0320 07:17:38.107807 4749 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:38 crc kubenswrapper[4749]: E0320 07:17:38.108676 4749 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:38 crc kubenswrapper[4749]: E0320 07:17:38.109519 4749 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": 
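The RemoveContainer / "ContainerStatus from runtime service failed" exchanges above are benign: the container was already removed on the first call, so the follow-up lookup gets a gRPC NotFound from the runtime, which the deletor logs and treats as "already gone" rather than as a failure. A sketch of that idempotent-delete check, using a stdlib sentinel error in place of the real gRPC NotFound status code:

    package main

    import (
        "errors"
        "fmt"
    )

    // errNotFound stands in for the runtime's gRPC NotFound status.
    var errNotFound = errors.New("could not find container: ID does not exist")

    // removeContainer is a stand-in for the CRI RemoveContainer call.
    func removeContainer(id string) error {
        return fmt.Errorf("rpc error: %w", errNotFound)
    }

    // deleteIgnoringNotFound treats "already gone" as success: removing a
    // container twice should not surface an error to the sync loop.
    func deleteIgnoringNotFound(id string) error {
        if err := removeContainer(id); err != nil && !errors.Is(err, errNotFound) {
            return err
        }
        fmt.Printf("container %s removed (or already gone)\n", id)
        return nil
    }

    func main() {
        _ = deleteIgnoringNotFound("416da31e")
    }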
dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:38 crc kubenswrapper[4749]: E0320 07:17:38.110305 4749 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:38 crc kubenswrapper[4749]: I0320 07:17:38.110362 4749 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 20 07:17:38 crc kubenswrapper[4749]: E0320 07:17:38.110768 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" interval="200ms" Mar 20 07:17:38 crc kubenswrapper[4749]: E0320 07:17:38.312268 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" interval="400ms" Mar 20 07:17:38 crc kubenswrapper[4749]: I0320 07:17:38.408810 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 07:17:38 crc kubenswrapper[4749]: I0320 07:17:38.411446 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"5f79f11fe5f1911b3210b36c9a630f224a7c92db0f2ba3a961bdb7d93f736d32"} Mar 20 07:17:38 crc kubenswrapper[4749]: I0320 07:17:38.412085 4749 status_manager.go:851] "Failed to get status for pod" podUID="d46e89a9-b781-454b-8e2d-870e92825114" pod="openshift-route-controller-manager/route-controller-manager-bb5477fd6-q8ksg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-bb5477fd6-q8ksg\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:38 crc kubenswrapper[4749]: I0320 07:17:38.412469 4749 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:38 crc kubenswrapper[4749]: I0320 07:17:38.412792 4749 status_manager.go:851] "Failed to get status for pod" podUID="30422da4-1696-4beb-be35-4216a61c897c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:38 crc kubenswrapper[4749]: I0320 07:17:38.413258 4749 status_manager.go:851] "Failed to get status for pod" podUID="86a83215-d581-4836-852f-721e6ea3db4b" pod="openshift-controller-manager/controller-manager-5b98466987-56rc8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5b98466987-56rc8\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:38 crc kubenswrapper[4749]: E0320 07:17:38.714047 4749 controller.go:145] 
"Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" interval="800ms" Mar 20 07:17:38 crc kubenswrapper[4749]: I0320 07:17:38.772319 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 07:17:38 crc kubenswrapper[4749]: I0320 07:17:38.773210 4749 status_manager.go:851] "Failed to get status for pod" podUID="30422da4-1696-4beb-be35-4216a61c897c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:38 crc kubenswrapper[4749]: I0320 07:17:38.773930 4749 status_manager.go:851] "Failed to get status for pod" podUID="86a83215-d581-4836-852f-721e6ea3db4b" pod="openshift-controller-manager/controller-manager-5b98466987-56rc8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5b98466987-56rc8\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:38 crc kubenswrapper[4749]: I0320 07:17:38.774533 4749 status_manager.go:851] "Failed to get status for pod" podUID="d46e89a9-b781-454b-8e2d-870e92825114" pod="openshift-route-controller-manager/route-controller-manager-bb5477fd6-q8ksg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-bb5477fd6-q8ksg\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:38 crc kubenswrapper[4749]: I0320 07:17:38.775196 4749 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:38 crc kubenswrapper[4749]: I0320 07:17:38.910989 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/30422da4-1696-4beb-be35-4216a61c897c-kubelet-dir\") pod \"30422da4-1696-4beb-be35-4216a61c897c\" (UID: \"30422da4-1696-4beb-be35-4216a61c897c\") " Mar 20 07:17:38 crc kubenswrapper[4749]: I0320 07:17:38.911071 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/30422da4-1696-4beb-be35-4216a61c897c-kube-api-access\") pod \"30422da4-1696-4beb-be35-4216a61c897c\" (UID: \"30422da4-1696-4beb-be35-4216a61c897c\") " Mar 20 07:17:38 crc kubenswrapper[4749]: I0320 07:17:38.911127 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/30422da4-1696-4beb-be35-4216a61c897c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "30422da4-1696-4beb-be35-4216a61c897c" (UID: "30422da4-1696-4beb-be35-4216a61c897c"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 07:17:38 crc kubenswrapper[4749]: I0320 07:17:38.911164 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/30422da4-1696-4beb-be35-4216a61c897c-var-lock\") pod \"30422da4-1696-4beb-be35-4216a61c897c\" (UID: \"30422da4-1696-4beb-be35-4216a61c897c\") " Mar 20 07:17:38 crc kubenswrapper[4749]: I0320 07:17:38.911426 4749 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/30422da4-1696-4beb-be35-4216a61c897c-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 07:17:38 crc kubenswrapper[4749]: I0320 07:17:38.911477 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/30422da4-1696-4beb-be35-4216a61c897c-var-lock" (OuterVolumeSpecName: "var-lock") pod "30422da4-1696-4beb-be35-4216a61c897c" (UID: "30422da4-1696-4beb-be35-4216a61c897c"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 07:17:38 crc kubenswrapper[4749]: I0320 07:17:38.918859 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30422da4-1696-4beb-be35-4216a61c897c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "30422da4-1696-4beb-be35-4216a61c897c" (UID: "30422da4-1696-4beb-be35-4216a61c897c"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:17:39 crc kubenswrapper[4749]: I0320 07:17:39.012341 4749 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/30422da4-1696-4beb-be35-4216a61c897c-var-lock\") on node \"crc\" DevicePath \"\"" Mar 20 07:17:39 crc kubenswrapper[4749]: I0320 07:17:39.012371 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/30422da4-1696-4beb-be35-4216a61c897c-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 07:17:39 crc kubenswrapper[4749]: I0320 07:17:39.424207 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 07:17:39 crc kubenswrapper[4749]: I0320 07:17:39.427203 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b7a9d3d56425dd88c89608d446f6d44c5f90644cea243dd023e74c5630a0a99e" exitCode=0 Mar 20 07:17:39 crc kubenswrapper[4749]: I0320 07:17:39.427346 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ccd8959d5986d59b545f33443d0226f113dd6eee7ea34c34e45eb36ba5aa9bc" Mar 20 07:17:39 crc kubenswrapper[4749]: I0320 07:17:39.430630 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 07:17:39 crc kubenswrapper[4749]: I0320 07:17:39.430696 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"30422da4-1696-4beb-be35-4216a61c897c","Type":"ContainerDied","Data":"66e94799422370bc83f0e8c9d0235ba6dedd9dd7b626f993c57d3a92b8825eda"} Mar 20 07:17:39 crc kubenswrapper[4749]: I0320 07:17:39.430744 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66e94799422370bc83f0e8c9d0235ba6dedd9dd7b626f993c57d3a92b8825eda" Mar 20 07:17:39 crc kubenswrapper[4749]: I0320 07:17:39.442060 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 07:17:39 crc kubenswrapper[4749]: I0320 07:17:39.442700 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 07:17:39 crc kubenswrapper[4749]: I0320 07:17:39.443480 4749 status_manager.go:851] "Failed to get status for pod" podUID="30422da4-1696-4beb-be35-4216a61c897c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:39 crc kubenswrapper[4749]: I0320 07:17:39.444002 4749 status_manager.go:851] "Failed to get status for pod" podUID="86a83215-d581-4836-852f-721e6ea3db4b" pod="openshift-controller-manager/controller-manager-5b98466987-56rc8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5b98466987-56rc8\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:39 crc kubenswrapper[4749]: I0320 07:17:39.444493 4749 status_manager.go:851] "Failed to get status for pod" podUID="d46e89a9-b781-454b-8e2d-870e92825114" pod="openshift-route-controller-manager/route-controller-manager-bb5477fd6-q8ksg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-bb5477fd6-q8ksg\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:39 crc kubenswrapper[4749]: I0320 07:17:39.444794 4749 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:39 crc kubenswrapper[4749]: I0320 07:17:39.444987 4749 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:39 crc kubenswrapper[4749]: I0320 07:17:39.445649 4749 status_manager.go:851] "Failed to get status for pod" podUID="d46e89a9-b781-454b-8e2d-870e92825114" pod="openshift-route-controller-manager/route-controller-manager-bb5477fd6-q8ksg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-bb5477fd6-q8ksg\": dial tcp 38.102.83.50:6443: connect: connection 
refused" Mar 20 07:17:39 crc kubenswrapper[4749]: I0320 07:17:39.446248 4749 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:39 crc kubenswrapper[4749]: I0320 07:17:39.446948 4749 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:39 crc kubenswrapper[4749]: I0320 07:17:39.447180 4749 status_manager.go:851] "Failed to get status for pod" podUID="30422da4-1696-4beb-be35-4216a61c897c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:39 crc kubenswrapper[4749]: I0320 07:17:39.447916 4749 status_manager.go:851] "Failed to get status for pod" podUID="86a83215-d581-4836-852f-721e6ea3db4b" pod="openshift-controller-manager/controller-manager-5b98466987-56rc8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5b98466987-56rc8\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:39 crc kubenswrapper[4749]: E0320 07:17:39.515398 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" interval="1.6s" Mar 20 07:17:39 crc kubenswrapper[4749]: I0320 07:17:39.518260 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 07:17:39 crc kubenswrapper[4749]: I0320 07:17:39.518334 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 07:17:39 crc kubenswrapper[4749]: I0320 07:17:39.518340 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 07:17:39 crc kubenswrapper[4749]: I0320 07:17:39.518377 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 07:17:39 crc kubenswrapper[4749]: I0320 07:17:39.518421 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 07:17:39 crc kubenswrapper[4749]: I0320 07:17:39.518524 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 07:17:39 crc kubenswrapper[4749]: I0320 07:17:39.518786 4749 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 20 07:17:39 crc kubenswrapper[4749]: I0320 07:17:39.518803 4749 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 07:17:39 crc kubenswrapper[4749]: I0320 07:17:39.518811 4749 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 20 07:17:40 crc kubenswrapper[4749]: I0320 07:17:40.193434 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 20 07:17:40 crc kubenswrapper[4749]: I0320 07:17:40.437699 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 07:17:40 crc kubenswrapper[4749]: I0320 07:17:40.439808 4749 status_manager.go:851] "Failed to get status for pod" podUID="30422da4-1696-4beb-be35-4216a61c897c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:40 crc kubenswrapper[4749]: I0320 07:17:40.440386 4749 status_manager.go:851] "Failed to get status for pod" podUID="86a83215-d581-4836-852f-721e6ea3db4b" pod="openshift-controller-manager/controller-manager-5b98466987-56rc8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5b98466987-56rc8\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:40 crc kubenswrapper[4749]: I0320 07:17:40.440625 4749 status_manager.go:851] "Failed to get status for pod" podUID="d46e89a9-b781-454b-8e2d-870e92825114" pod="openshift-route-controller-manager/route-controller-manager-bb5477fd6-q8ksg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-bb5477fd6-q8ksg\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:40 crc kubenswrapper[4749]: I0320 07:17:40.441010 4749 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:40 crc kubenswrapper[4749]: I0320 07:17:40.442399 4749 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:40 crc kubenswrapper[4749]: I0320 07:17:40.444516 4749 status_manager.go:851] "Failed to get status for pod" podUID="30422da4-1696-4beb-be35-4216a61c897c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:40 crc kubenswrapper[4749]: I0320 07:17:40.444998 4749 status_manager.go:851] "Failed to get status for pod" podUID="86a83215-d581-4836-852f-721e6ea3db4b" pod="openshift-controller-manager/controller-manager-5b98466987-56rc8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5b98466987-56rc8\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:40 crc kubenswrapper[4749]: I0320 07:17:40.445305 4749 status_manager.go:851] "Failed to get status for pod" podUID="d46e89a9-b781-454b-8e2d-870e92825114" pod="openshift-route-controller-manager/route-controller-manager-bb5477fd6-q8ksg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-bb5477fd6-q8ksg\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:40 crc kubenswrapper[4749]: I0320 07:17:40.445614 4749 status_manager.go:851] "Failed to get status for pod" 
podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:40 crc kubenswrapper[4749]: I0320 07:17:40.446128 4749 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:41 crc kubenswrapper[4749]: E0320 07:17:41.116460 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" interval="3.2s" Mar 20 07:17:42 crc kubenswrapper[4749]: E0320 07:17:42.922104 4749 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.50:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e7b77b6952e5c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 07:17:37.325108828 +0000 UTC m=+293.874766475,LastTimestamp:2026-03-20 07:17:37.325108828 +0000 UTC m=+293.874766475,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.113443 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" podUID="38b3f23d-6db5-4788-bcd5-810450677cd6" containerName="oauth-openshift" containerID="cri-o://9b4290553fefaf6bd1dc9d71cbd1a4957939dd415b4419fa033eb797173a8cb2" gracePeriod=15 Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.179017 4749 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.179756 4749 status_manager.go:851] "Failed to get status for pod" podUID="30422da4-1696-4beb-be35-4216a61c897c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.180107 4749 status_manager.go:851] "Failed to get status for pod" 
podUID="86a83215-d581-4836-852f-721e6ea3db4b" pod="openshift-controller-manager/controller-manager-5b98466987-56rc8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5b98466987-56rc8\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.180620 4749 status_manager.go:851] "Failed to get status for pod" podUID="d46e89a9-b781-454b-8e2d-870e92825114" pod="openshift-route-controller-manager/route-controller-manager-bb5477fd6-q8ksg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-bb5477fd6-q8ksg\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:44 crc kubenswrapper[4749]: E0320 07:17:44.317774 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" interval="6.4s" Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.467931 4749 generic.go:334] "Generic (PLEG): container finished" podID="38b3f23d-6db5-4788-bcd5-810450677cd6" containerID="9b4290553fefaf6bd1dc9d71cbd1a4957939dd415b4419fa033eb797173a8cb2" exitCode=0 Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.468134 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" event={"ID":"38b3f23d-6db5-4788-bcd5-810450677cd6","Type":"ContainerDied","Data":"9b4290553fefaf6bd1dc9d71cbd1a4957939dd415b4419fa033eb797173a8cb2"} Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.505144 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.505655 4749 status_manager.go:851] "Failed to get status for pod" podUID="d46e89a9-b781-454b-8e2d-870e92825114" pod="openshift-route-controller-manager/route-controller-manager-bb5477fd6-q8ksg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-bb5477fd6-q8ksg\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.506097 4749 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.506378 4749 status_manager.go:851] "Failed to get status for pod" podUID="38b3f23d-6db5-4788-bcd5-810450677cd6" pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-t5b5l\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.506724 4749 status_manager.go:851] "Failed to get status for pod" podUID="30422da4-1696-4beb-be35-4216a61c897c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.507608 4749 status_manager.go:851] "Failed to get status for pod" podUID="86a83215-d581-4836-852f-721e6ea3db4b" pod="openshift-controller-manager/controller-manager-5b98466987-56rc8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5b98466987-56rc8\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.679256 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-user-template-error\") pod \"38b3f23d-6db5-4788-bcd5-810450677cd6\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.679324 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-user-template-login\") pod \"38b3f23d-6db5-4788-bcd5-810450677cd6\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.679353 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/38b3f23d-6db5-4788-bcd5-810450677cd6-audit-dir\") pod \"38b3f23d-6db5-4788-bcd5-810450677cd6\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.679379 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-system-service-ca\") pod \"38b3f23d-6db5-4788-bcd5-810450677cd6\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.679402 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-system-serving-cert\") pod \"38b3f23d-6db5-4788-bcd5-810450677cd6\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.679433 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-system-cliconfig\") pod \"38b3f23d-6db5-4788-bcd5-810450677cd6\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.679457 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-system-ocp-branding-template\") pod \"38b3f23d-6db5-4788-bcd5-810450677cd6\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.679487 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/38b3f23d-6db5-4788-bcd5-810450677cd6-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "38b3f23d-6db5-4788-bcd5-810450677cd6" (UID: "38b3f23d-6db5-4788-bcd5-810450677cd6"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.679520 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gznf6\" (UniqueName: \"kubernetes.io/projected/38b3f23d-6db5-4788-bcd5-810450677cd6-kube-api-access-gznf6\") pod \"38b3f23d-6db5-4788-bcd5-810450677cd6\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.679549 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-user-idp-0-file-data\") pod \"38b3f23d-6db5-4788-bcd5-810450677cd6\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.679569 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-system-router-certs\") pod \"38b3f23d-6db5-4788-bcd5-810450677cd6\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.679595 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-system-trusted-ca-bundle\") pod \"38b3f23d-6db5-4788-bcd5-810450677cd6\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.679617 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-system-session\") pod \"38b3f23d-6db5-4788-bcd5-810450677cd6\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.679660 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-user-template-provider-selection\") pod \"38b3f23d-6db5-4788-bcd5-810450677cd6\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.679692 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/38b3f23d-6db5-4788-bcd5-810450677cd6-audit-policies\") pod \"38b3f23d-6db5-4788-bcd5-810450677cd6\" (UID: \"38b3f23d-6db5-4788-bcd5-810450677cd6\") " Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.679911 4749 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/38b3f23d-6db5-4788-bcd5-810450677cd6-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.680334 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38b3f23d-6db5-4788-bcd5-810450677cd6-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "38b3f23d-6db5-4788-bcd5-810450677cd6" (UID: "38b3f23d-6db5-4788-bcd5-810450677cd6"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.680352 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "38b3f23d-6db5-4788-bcd5-810450677cd6" (UID: "38b3f23d-6db5-4788-bcd5-810450677cd6"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.681444 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "38b3f23d-6db5-4788-bcd5-810450677cd6" (UID: "38b3f23d-6db5-4788-bcd5-810450677cd6"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.681439 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "38b3f23d-6db5-4788-bcd5-810450677cd6" (UID: "38b3f23d-6db5-4788-bcd5-810450677cd6"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.686306 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38b3f23d-6db5-4788-bcd5-810450677cd6-kube-api-access-gznf6" (OuterVolumeSpecName: "kube-api-access-gznf6") pod "38b3f23d-6db5-4788-bcd5-810450677cd6" (UID: "38b3f23d-6db5-4788-bcd5-810450677cd6"). InnerVolumeSpecName "kube-api-access-gznf6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.687583 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "38b3f23d-6db5-4788-bcd5-810450677cd6" (UID: "38b3f23d-6db5-4788-bcd5-810450677cd6"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.687946 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "38b3f23d-6db5-4788-bcd5-810450677cd6" (UID: "38b3f23d-6db5-4788-bcd5-810450677cd6"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.688455 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "38b3f23d-6db5-4788-bcd5-810450677cd6" (UID: "38b3f23d-6db5-4788-bcd5-810450677cd6"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.688927 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "38b3f23d-6db5-4788-bcd5-810450677cd6" (UID: "38b3f23d-6db5-4788-bcd5-810450677cd6"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.689117 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "38b3f23d-6db5-4788-bcd5-810450677cd6" (UID: "38b3f23d-6db5-4788-bcd5-810450677cd6"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.689405 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "38b3f23d-6db5-4788-bcd5-810450677cd6" (UID: "38b3f23d-6db5-4788-bcd5-810450677cd6"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.690433 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "38b3f23d-6db5-4788-bcd5-810450677cd6" (UID: "38b3f23d-6db5-4788-bcd5-810450677cd6"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.690916 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "38b3f23d-6db5-4788-bcd5-810450677cd6" (UID: "38b3f23d-6db5-4788-bcd5-810450677cd6"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.780847 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.780901 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.780923 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.780945 4749 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/38b3f23d-6db5-4788-bcd5-810450677cd6-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.780964 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.780982 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.781000 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.781017 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.781036 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.781054 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 20 
07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.781073 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gznf6\" (UniqueName: \"kubernetes.io/projected/38b3f23d-6db5-4788-bcd5-810450677cd6-kube-api-access-gznf6\") on node \"crc\" DevicePath \"\"" Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.781090 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 20 07:17:44 crc kubenswrapper[4749]: I0320 07:17:44.781108 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/38b3f23d-6db5-4788-bcd5-810450677cd6-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 20 07:17:45 crc kubenswrapper[4749]: I0320 07:17:45.477054 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" event={"ID":"38b3f23d-6db5-4788-bcd5-810450677cd6","Type":"ContainerDied","Data":"f49ba121379c7d5198c293dcb95afaaf4bb879014adafa6923fcf7f2c9f0066a"} Mar 20 07:17:45 crc kubenswrapper[4749]: I0320 07:17:45.477117 4749 scope.go:117] "RemoveContainer" containerID="9b4290553fefaf6bd1dc9d71cbd1a4957939dd415b4419fa033eb797173a8cb2" Mar 20 07:17:45 crc kubenswrapper[4749]: I0320 07:17:45.477143 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" Mar 20 07:17:45 crc kubenswrapper[4749]: I0320 07:17:45.478321 4749 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:45 crc kubenswrapper[4749]: I0320 07:17:45.478899 4749 status_manager.go:851] "Failed to get status for pod" podUID="30422da4-1696-4beb-be35-4216a61c897c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:45 crc kubenswrapper[4749]: I0320 07:17:45.480009 4749 status_manager.go:851] "Failed to get status for pod" podUID="86a83215-d581-4836-852f-721e6ea3db4b" pod="openshift-controller-manager/controller-manager-5b98466987-56rc8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5b98466987-56rc8\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:45 crc kubenswrapper[4749]: I0320 07:17:45.480399 4749 status_manager.go:851] "Failed to get status for pod" podUID="38b3f23d-6db5-4788-bcd5-810450677cd6" pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-t5b5l\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:45 crc kubenswrapper[4749]: I0320 07:17:45.480934 4749 status_manager.go:851] "Failed to get status for pod" podUID="d46e89a9-b781-454b-8e2d-870e92825114" pod="openshift-route-controller-manager/route-controller-manager-bb5477fd6-q8ksg" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-bb5477fd6-q8ksg\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:45 crc kubenswrapper[4749]: I0320 07:17:45.502555 4749 status_manager.go:851] "Failed to get status for pod" podUID="30422da4-1696-4beb-be35-4216a61c897c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:45 crc kubenswrapper[4749]: I0320 07:17:45.503105 4749 status_manager.go:851] "Failed to get status for pod" podUID="86a83215-d581-4836-852f-721e6ea3db4b" pod="openshift-controller-manager/controller-manager-5b98466987-56rc8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5b98466987-56rc8\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:45 crc kubenswrapper[4749]: I0320 07:17:45.503436 4749 status_manager.go:851] "Failed to get status for pod" podUID="38b3f23d-6db5-4788-bcd5-810450677cd6" pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-t5b5l\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:45 crc kubenswrapper[4749]: I0320 07:17:45.503744 4749 status_manager.go:851] "Failed to get status for pod" podUID="d46e89a9-b781-454b-8e2d-870e92825114" pod="openshift-route-controller-manager/route-controller-manager-bb5477fd6-q8ksg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-bb5477fd6-q8ksg\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:45 crc kubenswrapper[4749]: I0320 07:17:45.504138 4749 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:50 crc kubenswrapper[4749]: I0320 07:17:50.177166 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 07:17:50 crc kubenswrapper[4749]: I0320 07:17:50.180373 4749 status_manager.go:851] "Failed to get status for pod" podUID="86a83215-d581-4836-852f-721e6ea3db4b" pod="openshift-controller-manager/controller-manager-5b98466987-56rc8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5b98466987-56rc8\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:50 crc kubenswrapper[4749]: I0320 07:17:50.181369 4749 status_manager.go:851] "Failed to get status for pod" podUID="38b3f23d-6db5-4788-bcd5-810450677cd6" pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-t5b5l\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:50 crc kubenswrapper[4749]: I0320 07:17:50.182058 4749 status_manager.go:851] "Failed to get status for pod" podUID="30422da4-1696-4beb-be35-4216a61c897c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:50 crc kubenswrapper[4749]: I0320 07:17:50.182613 4749 status_manager.go:851] "Failed to get status for pod" podUID="d46e89a9-b781-454b-8e2d-870e92825114" pod="openshift-route-controller-manager/route-controller-manager-bb5477fd6-q8ksg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-bb5477fd6-q8ksg\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:50 crc kubenswrapper[4749]: I0320 07:17:50.183194 4749 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:50 crc kubenswrapper[4749]: I0320 07:17:50.193879 4749 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d36aabe4-f4b7-4552-848b-0c22f7ac4753" Mar 20 07:17:50 crc kubenswrapper[4749]: I0320 07:17:50.194101 4749 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d36aabe4-f4b7-4552-848b-0c22f7ac4753" Mar 20 07:17:50 crc kubenswrapper[4749]: E0320 07:17:50.195016 4749 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 07:17:50 crc kubenswrapper[4749]: I0320 07:17:50.195730 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 07:17:50 crc kubenswrapper[4749]: W0320 07:17:50.227851 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-91269065357c7bf48bff326358d1d0efaa351fc3b15ef0a2372c526153f82399 WatchSource:0}: Error finding container 91269065357c7bf48bff326358d1d0efaa351fc3b15ef0a2372c526153f82399: Status 404 returned error can't find the container with id 91269065357c7bf48bff326358d1d0efaa351fc3b15ef0a2372c526153f82399 Mar 20 07:17:50 crc kubenswrapper[4749]: I0320 07:17:50.513550 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 07:17:50 crc kubenswrapper[4749]: I0320 07:17:50.514379 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 20 07:17:50 crc kubenswrapper[4749]: I0320 07:17:50.514460 4749 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="1ee1edb21f5116ef152f5808824f0529ac3bc52a3959df7a21c031da45b5284a" exitCode=1 Mar 20 07:17:50 crc kubenswrapper[4749]: I0320 07:17:50.514567 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"1ee1edb21f5116ef152f5808824f0529ac3bc52a3959df7a21c031da45b5284a"} Mar 20 07:17:50 crc kubenswrapper[4749]: I0320 07:17:50.515160 4749 scope.go:117] "RemoveContainer" containerID="1ee1edb21f5116ef152f5808824f0529ac3bc52a3959df7a21c031da45b5284a" Mar 20 07:17:50 crc kubenswrapper[4749]: I0320 07:17:50.515492 4749 status_manager.go:851] "Failed to get status for pod" podUID="30422da4-1696-4beb-be35-4216a61c897c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:50 crc kubenswrapper[4749]: I0320 07:17:50.515799 4749 status_manager.go:851] "Failed to get status for pod" podUID="86a83215-d581-4836-852f-721e6ea3db4b" pod="openshift-controller-manager/controller-manager-5b98466987-56rc8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5b98466987-56rc8\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:50 crc kubenswrapper[4749]: I0320 07:17:50.516496 4749 status_manager.go:851] "Failed to get status for pod" podUID="38b3f23d-6db5-4788-bcd5-810450677cd6" pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-t5b5l\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:50 crc kubenswrapper[4749]: I0320 07:17:50.516975 4749 status_manager.go:851] "Failed to get status for pod" podUID="d46e89a9-b781-454b-8e2d-870e92825114" pod="openshift-route-controller-manager/route-controller-manager-bb5477fd6-q8ksg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-bb5477fd6-q8ksg\": dial tcp 38.102.83.50:6443: connect: connection 
refused" Mar 20 07:17:50 crc kubenswrapper[4749]: I0320 07:17:50.517315 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2d03b0f9ef34f14ae84b899d1c6ce9e95e1f8209352417e6fe2a79871461238a"} Mar 20 07:17:50 crc kubenswrapper[4749]: I0320 07:17:50.517350 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"91269065357c7bf48bff326358d1d0efaa351fc3b15ef0a2372c526153f82399"} Mar 20 07:17:50 crc kubenswrapper[4749]: I0320 07:17:50.517358 4749 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:50 crc kubenswrapper[4749]: I0320 07:17:50.517609 4749 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d36aabe4-f4b7-4552-848b-0c22f7ac4753" Mar 20 07:17:50 crc kubenswrapper[4749]: I0320 07:17:50.517629 4749 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d36aabe4-f4b7-4552-848b-0c22f7ac4753" Mar 20 07:17:50 crc kubenswrapper[4749]: I0320 07:17:50.517965 4749 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:50 crc kubenswrapper[4749]: E0320 07:17:50.517992 4749 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 07:17:50 crc kubenswrapper[4749]: I0320 07:17:50.518812 4749 status_manager.go:851] "Failed to get status for pod" podUID="d46e89a9-b781-454b-8e2d-870e92825114" pod="openshift-route-controller-manager/route-controller-manager-bb5477fd6-q8ksg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-bb5477fd6-q8ksg\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:50 crc kubenswrapper[4749]: I0320 07:17:50.519150 4749 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:50 crc kubenswrapper[4749]: I0320 07:17:50.519545 4749 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:50 crc 
kubenswrapper[4749]: I0320 07:17:50.520091 4749 status_manager.go:851] "Failed to get status for pod" podUID="30422da4-1696-4beb-be35-4216a61c897c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:50 crc kubenswrapper[4749]: I0320 07:17:50.520542 4749 status_manager.go:851] "Failed to get status for pod" podUID="86a83215-d581-4836-852f-721e6ea3db4b" pod="openshift-controller-manager/controller-manager-5b98466987-56rc8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5b98466987-56rc8\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:50 crc kubenswrapper[4749]: I0320 07:17:50.520997 4749 status_manager.go:851] "Failed to get status for pod" podUID="38b3f23d-6db5-4788-bcd5-810450677cd6" pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-t5b5l\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:50 crc kubenswrapper[4749]: I0320 07:17:50.539155 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 07:17:50 crc kubenswrapper[4749]: E0320 07:17:50.718745 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.50:6443: connect: connection refused" interval="7s" Mar 20 07:17:51 crc kubenswrapper[4749]: I0320 07:17:51.523723 4749 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="2d03b0f9ef34f14ae84b899d1c6ce9e95e1f8209352417e6fe2a79871461238a" exitCode=0 Mar 20 07:17:51 crc kubenswrapper[4749]: I0320 07:17:51.523835 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"2d03b0f9ef34f14ae84b899d1c6ce9e95e1f8209352417e6fe2a79871461238a"} Mar 20 07:17:51 crc kubenswrapper[4749]: I0320 07:17:51.524218 4749 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d36aabe4-f4b7-4552-848b-0c22f7ac4753" Mar 20 07:17:51 crc kubenswrapper[4749]: I0320 07:17:51.524236 4749 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d36aabe4-f4b7-4552-848b-0c22f7ac4753" Mar 20 07:17:51 crc kubenswrapper[4749]: E0320 07:17:51.524728 4749 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.50:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 07:17:51 crc kubenswrapper[4749]: I0320 07:17:51.524952 4749 status_manager.go:851] "Failed to get status for pod" podUID="d46e89a9-b781-454b-8e2d-870e92825114" pod="openshift-route-controller-manager/route-controller-manager-bb5477fd6-q8ksg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-bb5477fd6-q8ksg\": dial tcp 38.102.83.50:6443: connect: connection refused" Mar 20 07:17:51 crc kubenswrapper[4749]: 
Mar 20 07:17:51 crc kubenswrapper[4749]: I0320 07:17:51.525695 4749 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.50:6443: connect: connection refused"
Mar 20 07:17:51 crc kubenswrapper[4749]: I0320 07:17:51.526133 4749 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.50:6443: connect: connection refused"
Mar 20 07:17:51 crc kubenswrapper[4749]: I0320 07:17:51.526870 4749 status_manager.go:851] "Failed to get status for pod" podUID="38b3f23d-6db5-4788-bcd5-810450677cd6" pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-t5b5l\": dial tcp 38.102.83.50:6443: connect: connection refused"
Mar 20 07:17:51 crc kubenswrapper[4749]: I0320 07:17:51.527924 4749 status_manager.go:851] "Failed to get status for pod" podUID="30422da4-1696-4beb-be35-4216a61c897c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused"
Mar 20 07:17:51 crc kubenswrapper[4749]: I0320 07:17:51.528191 4749 status_manager.go:851] "Failed to get status for pod" podUID="86a83215-d581-4836-852f-721e6ea3db4b" pod="openshift-controller-manager/controller-manager-5b98466987-56rc8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5b98466987-56rc8\": dial tcp 38.102.83.50:6443: connect: connection refused"
Mar 20 07:17:51 crc kubenswrapper[4749]: I0320 07:17:51.529198 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 20 07:17:51 crc kubenswrapper[4749]: I0320 07:17:51.529973 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Mar 20 07:17:51 crc kubenswrapper[4749]: I0320 07:17:51.530075 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"68430e9e5e0b03782542c5ec3bf0f83873ebd1bcf6e72f2377978ecf6153ecb9"}
Mar 20 07:17:51 crc kubenswrapper[4749]: I0320 07:17:51.530851 4749 status_manager.go:851] "Failed to get status for pod" podUID="d46e89a9-b781-454b-8e2d-870e92825114" pod="openshift-route-controller-manager/route-controller-manager-bb5477fd6-q8ksg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-bb5477fd6-q8ksg\": dial tcp 38.102.83.50:6443: connect: connection refused"
Mar 20 07:17:51 crc kubenswrapper[4749]: I0320 07:17:51.531311 4749 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.50:6443: connect: connection refused"
Mar 20 07:17:51 crc kubenswrapper[4749]: I0320 07:17:51.531588 4749 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.50:6443: connect: connection refused"
Mar 20 07:17:51 crc kubenswrapper[4749]: I0320 07:17:51.532046 4749 status_manager.go:851] "Failed to get status for pod" podUID="38b3f23d-6db5-4788-bcd5-810450677cd6" pod="openshift-authentication/oauth-openshift-558db77b4-t5b5l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-t5b5l\": dial tcp 38.102.83.50:6443: connect: connection refused"
Mar 20 07:17:51 crc kubenswrapper[4749]: I0320 07:17:51.532451 4749 status_manager.go:851] "Failed to get status for pod" podUID="30422da4-1696-4beb-be35-4216a61c897c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.50:6443: connect: connection refused"
Mar 20 07:17:51 crc kubenswrapper[4749]: I0320 07:17:51.532878 4749 status_manager.go:851] "Failed to get status for pod" podUID="86a83215-d581-4836-852f-721e6ea3db4b" pod="openshift-controller-manager/controller-manager-5b98466987-56rc8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5b98466987-56rc8\": dial tcp 38.102.83.50:6443: connect: connection refused"
Mar 20 07:17:52 crc kubenswrapper[4749]: I0320 07:17:52.536434 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d61a005fca1f4b4897462d683fec21acd2d78f822030484e0e717e5712da9597"}
Mar 20 07:17:52 crc kubenswrapper[4749]: I0320 07:17:52.536670 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"def8dfc8032de7c5a5b1be6dbdf98f6524be05b62c6af9ba15637ef73b687667"}
Mar 20 07:17:52 crc kubenswrapper[4749]: I0320 07:17:52.536679 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e657bfeb3d9dbc8f54227a614e8494d39e79df64dee3194e6918b7bd99217552"}
Mar 20 07:17:53 crc kubenswrapper[4749]: I0320 07:17:53.548656 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b05040274d24d162f029d557cb7f63cb9a606e60555e90a27543bd9ccf4b9ff5"}
Mar 20 07:17:53 crc kubenswrapper[4749]: I0320 07:17:53.548705 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c8f6f3c579ef4234d62d062df5f006c57038231db397c70b6021e05f313b9fa2"}
Mar 20 07:17:53 crc kubenswrapper[4749]: I0320 07:17:53.548958 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 07:17:53 crc kubenswrapper[4749]: I0320 07:17:53.549164 4749 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d36aabe4-f4b7-4552-848b-0c22f7ac4753"
Mar 20 07:17:53 crc kubenswrapper[4749]: I0320 07:17:53.549198 4749 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d36aabe4-f4b7-4552-848b-0c22f7ac4753"
Mar 20 07:17:55 crc kubenswrapper[4749]: I0320 07:17:55.196416 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 07:17:55 crc kubenswrapper[4749]: I0320 07:17:55.196817 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 07:17:55 crc kubenswrapper[4749]: I0320 07:17:55.204193 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 07:17:58 crc kubenswrapper[4749]: I0320 07:17:58.023617 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 07:17:58 crc kubenswrapper[4749]: I0320 07:17:58.031863 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 07:17:58 crc kubenswrapper[4749]: I0320 07:17:58.560684 4749 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 07:17:58 crc kubenswrapper[4749]: I0320 07:17:58.580023 4749 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d36aabe4-f4b7-4552-848b-0c22f7ac4753"
Mar 20 07:17:58 crc kubenswrapper[4749]: I0320 07:17:58.580053 4749 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d36aabe4-f4b7-4552-848b-0c22f7ac4753"
Mar 20 07:17:58 crc kubenswrapper[4749]: I0320 07:17:58.580176 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 07:17:58 crc kubenswrapper[4749]: I0320 07:17:58.585367 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 07:17:58 crc kubenswrapper[4749]: I0320 07:17:58.775227 4749 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="985feb99-0175-41dc-bec4-d67c7719da17"
Mar 20 07:17:59 crc kubenswrapper[4749]: I0320 07:17:59.585136 4749 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d36aabe4-f4b7-4552-848b-0c22f7ac4753"
Mar 20 07:17:59 crc kubenswrapper[4749]: I0320 07:17:59.585501 4749 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d36aabe4-f4b7-4552-848b-0c22f7ac4753"
Mar 20 07:17:59 crc kubenswrapper[4749]: I0320 07:17:59.588594 4749 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="985feb99-0175-41dc-bec4-d67c7719da17"
Mar 20 07:18:00 crc kubenswrapper[4749]: I0320 07:18:00.543187 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 07:18:08 crc kubenswrapper[4749]: I0320 07:18:08.091452 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 20 07:18:08 crc kubenswrapper[4749]: I0320 07:18:08.784474 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 20 07:18:08 crc kubenswrapper[4749]: I0320 07:18:08.828482 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 20 07:18:08 crc kubenswrapper[4749]: I0320 07:18:08.883075 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 20 07:18:09 crc kubenswrapper[4749]: I0320 07:18:09.066282 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 20 07:18:09 crc kubenswrapper[4749]: I0320 07:18:09.192333 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 20 07:18:09 crc kubenswrapper[4749]: I0320 07:18:09.249684 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 20 07:18:09 crc kubenswrapper[4749]: I0320 07:18:09.385923 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 20 07:18:09 crc kubenswrapper[4749]: I0320 07:18:09.443575 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 20 07:18:09 crc kubenswrapper[4749]: I0320 07:18:09.502739 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 20 07:18:09 crc kubenswrapper[4749]: I0320 07:18:09.669686 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 20 07:18:09 crc kubenswrapper[4749]: I0320 07:18:09.820441 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 20 07:18:10 crc kubenswrapper[4749]: I0320 07:18:10.112905 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 20 07:18:10 crc kubenswrapper[4749]: I0320 07:18:10.162071 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 20 07:18:10 crc kubenswrapper[4749]: I0320 07:18:10.224223 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 20 07:18:10 crc kubenswrapper[4749]: I0320 07:18:10.281854 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 20 07:18:10 crc kubenswrapper[4749]: I0320 07:18:10.482747 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 20 07:18:10 crc kubenswrapper[4749]: I0320 07:18:10.496409 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 20 07:18:10 crc kubenswrapper[4749]: I0320 07:18:10.693107 4749 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 20 07:18:10 crc kubenswrapper[4749]: I0320 07:18:10.973815 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 20 07:18:11 crc kubenswrapper[4749]: I0320 07:18:11.067855 4749 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 20 07:18:11 crc kubenswrapper[4749]: I0320 07:18:11.078856 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 20 07:18:11 crc kubenswrapper[4749]: I0320 07:18:11.127797 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 20 07:18:11 crc kubenswrapper[4749]: I0320 07:18:11.164046 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 20 07:18:11 crc kubenswrapper[4749]: I0320 07:18:11.513014 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 20 07:18:11 crc kubenswrapper[4749]: I0320 07:18:11.554649 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 20 07:18:11 crc kubenswrapper[4749]: I0320 07:18:11.580869 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 20 07:18:11 crc kubenswrapper[4749]: I0320 07:18:11.637820 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 20 07:18:11 crc kubenswrapper[4749]: I0320 07:18:11.760709 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 20 07:18:11 crc kubenswrapper[4749]: I0320 07:18:11.769907 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 20 07:18:11 crc kubenswrapper[4749]: I0320 07:18:11.780131 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 20 07:18:11 crc kubenswrapper[4749]: I0320 07:18:11.783940 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 20 07:18:11 crc kubenswrapper[4749]: I0320 07:18:11.877792 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 20 07:18:12 crc kubenswrapper[4749]: I0320 07:18:12.001993 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 20 07:18:12 crc kubenswrapper[4749]: I0320 07:18:12.074864 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 20 07:18:12 crc kubenswrapper[4749]: I0320 07:18:12.127094 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 20 07:18:12 crc kubenswrapper[4749]: I0320 07:18:12.154654 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 20 07:18:12 crc kubenswrapper[4749]: I0320 07:18:12.347224 4749 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 20 07:18:12 crc kubenswrapper[4749]: I0320 07:18:12.413790 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 20 07:18:12 crc kubenswrapper[4749]: I0320 07:18:12.453857 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 20 07:18:12 crc kubenswrapper[4749]: I0320 07:18:12.495359 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 20 07:18:12 crc kubenswrapper[4749]: I0320 07:18:12.579152 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 20 07:18:12 crc kubenswrapper[4749]: I0320 07:18:12.601434 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 20 07:18:12 crc kubenswrapper[4749]: I0320 07:18:12.614376 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 20 07:18:12 crc kubenswrapper[4749]: I0320 07:18:12.653883 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 20 07:18:12 crc kubenswrapper[4749]: I0320 07:18:12.690909 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 20 07:18:12 crc kubenswrapper[4749]: I0320 07:18:12.787184 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 20 07:18:12 crc kubenswrapper[4749]: I0320 07:18:12.854065 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 20 07:18:13 crc kubenswrapper[4749]: I0320 07:18:13.001477 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 20 07:18:13 crc kubenswrapper[4749]: I0320 07:18:13.080849 4749 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 20 07:18:13 crc kubenswrapper[4749]: I0320 07:18:13.083228 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=37.083211672 podStartE2EDuration="37.083211672s" podCreationTimestamp="2026-03-20 07:17:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:17:58.644477759 +0000 UTC m=+315.194135396" watchObservedRunningTime="2026-03-20 07:18:13.083211672 +0000 UTC m=+329.632869319" Mar 20 07:18:13 crc kubenswrapper[4749]: I0320 07:18:13.085492 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-t5b5l","openshift-controller-manager/controller-manager-5b98466987-56rc8","openshift-route-controller-manager/route-controller-manager-bb5477fd6-q8ksg","openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 07:18:13 crc kubenswrapper[4749]: I0320 07:18:13.085561 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 07:18:13 crc kubenswrapper[4749]: I0320 07:18:13.087013 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 20 07:18:13 crc kubenswrapper[4749]: I0320 07:18:13.093408 4749 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 07:18:13 crc kubenswrapper[4749]: I0320 07:18:13.115824 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=15.115802689 podStartE2EDuration="15.115802689s" podCreationTimestamp="2026-03-20 07:17:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:18:13.114834666 +0000 UTC m=+329.664492333" watchObservedRunningTime="2026-03-20 07:18:13.115802689 +0000 UTC m=+329.665460346" Mar 20 07:18:13 crc kubenswrapper[4749]: I0320 07:18:13.152859 4749 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 20 07:18:13 crc kubenswrapper[4749]: I0320 07:18:13.254479 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 20 07:18:13 crc kubenswrapper[4749]: I0320 07:18:13.256597 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 20 07:18:13 crc kubenswrapper[4749]: I0320 07:18:13.275381 4749 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 20 07:18:13 crc kubenswrapper[4749]: I0320 07:18:13.291241 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 20 07:18:13 crc kubenswrapper[4749]: I0320 07:18:13.362028 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 20 07:18:13 crc kubenswrapper[4749]: I0320 07:18:13.363899 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 20 07:18:13 crc kubenswrapper[4749]: I0320 07:18:13.426933 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 20 07:18:13 crc kubenswrapper[4749]: I0320 07:18:13.477187 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 20 07:18:13 crc kubenswrapper[4749]: I0320 07:18:13.622259 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 20 07:18:13 crc kubenswrapper[4749]: I0320 07:18:13.629244 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 20 07:18:13 crc kubenswrapper[4749]: I0320 07:18:13.636823 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 20 07:18:13 crc kubenswrapper[4749]: I0320 07:18:13.684824 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 20 07:18:13 crc kubenswrapper[4749]: I0320 07:18:13.721551 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 20 07:18:13 crc kubenswrapper[4749]: I0320 07:18:13.759617 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 07:18:13 crc kubenswrapper[4749]: I0320 
Mar 20 07:18:13 crc kubenswrapper[4749]: I0320 07:18:13.842435 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 20 07:18:13 crc kubenswrapper[4749]: I0320 07:18:13.981986 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 20 07:18:14 crc kubenswrapper[4749]: I0320 07:18:14.013250 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 20 07:18:14 crc kubenswrapper[4749]: I0320 07:18:14.066609 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 20 07:18:14 crc kubenswrapper[4749]: I0320 07:18:14.089584 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 20 07:18:14 crc kubenswrapper[4749]: I0320 07:18:14.091202 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 20 07:18:14 crc kubenswrapper[4749]: I0320 07:18:14.095041 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 20 07:18:14 crc kubenswrapper[4749]: I0320 07:18:14.130763 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Mar 20 07:18:14 crc kubenswrapper[4749]: I0320 07:18:14.134536 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Mar 20 07:18:14 crc kubenswrapper[4749]: I0320 07:18:14.186050 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38b3f23d-6db5-4788-bcd5-810450677cd6" path="/var/lib/kubelet/pods/38b3f23d-6db5-4788-bcd5-810450677cd6/volumes"
Mar 20 07:18:14 crc kubenswrapper[4749]: I0320 07:18:14.187487 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86a83215-d581-4836-852f-721e6ea3db4b" path="/var/lib/kubelet/pods/86a83215-d581-4836-852f-721e6ea3db4b/volumes"
Mar 20 07:18:14 crc kubenswrapper[4749]: I0320 07:18:14.188673 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d46e89a9-b781-454b-8e2d-870e92825114" path="/var/lib/kubelet/pods/d46e89a9-b781-454b-8e2d-870e92825114/volumes"
Mar 20 07:18:14 crc kubenswrapper[4749]: I0320 07:18:14.235173 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 20 07:18:14 crc kubenswrapper[4749]: I0320 07:18:14.285638 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 20 07:18:14 crc kubenswrapper[4749]: I0320 07:18:14.508829 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 20 07:18:14 crc kubenswrapper[4749]: I0320 07:18:14.636954 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 20 07:18:14 crc kubenswrapper[4749]: I0320 07:18:14.680554 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 20 07:18:14 crc kubenswrapper[4749]: I0320 07:18:14.805521 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 20 07:18:14 crc kubenswrapper[4749]: I0320 07:18:14.946727 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Mar 20 07:18:14 crc kubenswrapper[4749]: I0320 07:18:14.958682 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Mar 20 07:18:14 crc kubenswrapper[4749]: I0320 07:18:14.968681 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.046146 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.080478 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.103639 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.135872 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.267925 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566518-llhfx"]
Mar 20 07:18:15 crc kubenswrapper[4749]: E0320 07:18:15.268405 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38b3f23d-6db5-4788-bcd5-810450677cd6" containerName="oauth-openshift"
Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.268510 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="38b3f23d-6db5-4788-bcd5-810450677cd6" containerName="oauth-openshift"
Mar 20 07:18:15 crc kubenswrapper[4749]: E0320 07:18:15.268595 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30422da4-1696-4beb-be35-4216a61c897c" containerName="installer"
Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.268686 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="30422da4-1696-4beb-be35-4216a61c897c" containerName="installer"
Mar 20 07:18:15 crc kubenswrapper[4749]: E0320 07:18:15.268830 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86a83215-d581-4836-852f-721e6ea3db4b" containerName="controller-manager"
Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.268920 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="86a83215-d581-4836-852f-721e6ea3db4b" containerName="controller-manager"
Mar 20 07:18:15 crc kubenswrapper[4749]: E0320 07:18:15.269013 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d46e89a9-b781-454b-8e2d-870e92825114" containerName="route-controller-manager"
Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.269096 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d46e89a9-b781-454b-8e2d-870e92825114" containerName="route-controller-manager"
Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.269279 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="86a83215-d581-4836-852f-721e6ea3db4b" containerName="controller-manager"
Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.269403 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d46e89a9-b781-454b-8e2d-870e92825114" containerName="route-controller-manager"
Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.269491 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="38b3f23d-6db5-4788-bcd5-810450677cd6" containerName="oauth-openshift"
Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.269597 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="30422da4-1696-4beb-be35-4216a61c897c" containerName="installer"
Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.270074 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566518-llhfx"
Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.271833 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-75664d8567-9mxsx"]
Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.272366 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6rhdf"
Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.272579 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-75664d8567-9mxsx"
Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.272656 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.272763 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.274336 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.274492 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.274573 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.274753 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.275157 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58f5c867fd-r8rn9"]
Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.275542 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.275622 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58f5c867fd-r8rn9"
Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.275910 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.277712 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.278609 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.278655 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.278826 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.278915 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.278615 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.281543 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.327918 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.378802 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpfgd\" (UniqueName: \"kubernetes.io/projected/8cf7ea32-0ab2-4496-a420-111852825393-kube-api-access-hpfgd\") pod \"route-controller-manager-58f5c867fd-r8rn9\" (UID: \"8cf7ea32-0ab2-4496-a420-111852825393\") " pod="openshift-route-controller-manager/route-controller-manager-58f5c867fd-r8rn9"
Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.378876 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da644329-3e0d-46bc-b763-979b8d2d3926-serving-cert\") pod \"controller-manager-75664d8567-9mxsx\" (UID: \"da644329-3e0d-46bc-b763-979b8d2d3926\") " pod="openshift-controller-manager/controller-manager-75664d8567-9mxsx"
Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.378933 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da644329-3e0d-46bc-b763-979b8d2d3926-config\") pod \"controller-manager-75664d8567-9mxsx\" (UID: \"da644329-3e0d-46bc-b763-979b8d2d3926\") " pod="openshift-controller-manager/controller-manager-75664d8567-9mxsx"
Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.378978 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvcl6\" (UniqueName: \"kubernetes.io/projected/0fef1b07-a814-496c-913e-301e76688b96-kube-api-access-gvcl6\") pod \"auto-csr-approver-29566518-llhfx\" (UID: \"0fef1b07-a814-496c-913e-301e76688b96\") " pod="openshift-infra/auto-csr-approver-29566518-llhfx"
Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.379041 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cf7ea32-0ab2-4496-a420-111852825393-serving-cert\") pod \"route-controller-manager-58f5c867fd-r8rn9\" (UID: \"8cf7ea32-0ab2-4496-a420-111852825393\") " pod="openshift-route-controller-manager/route-controller-manager-58f5c867fd-r8rn9"
Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.379073 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwdqb\" (UniqueName: \"kubernetes.io/projected/da644329-3e0d-46bc-b763-979b8d2d3926-kube-api-access-xwdqb\") pod \"controller-manager-75664d8567-9mxsx\" (UID: \"da644329-3e0d-46bc-b763-979b8d2d3926\") " pod="openshift-controller-manager/controller-manager-75664d8567-9mxsx"
Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.379113 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/da644329-3e0d-46bc-b763-979b8d2d3926-proxy-ca-bundles\") pod \"controller-manager-75664d8567-9mxsx\" (UID: \"da644329-3e0d-46bc-b763-979b8d2d3926\") " pod="openshift-controller-manager/controller-manager-75664d8567-9mxsx"
Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.379176 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8cf7ea32-0ab2-4496-a420-111852825393-client-ca\") pod \"route-controller-manager-58f5c867fd-r8rn9\" (UID: \"8cf7ea32-0ab2-4496-a420-111852825393\") " pod="openshift-route-controller-manager/route-controller-manager-58f5c867fd-r8rn9"
Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.379222 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cf7ea32-0ab2-4496-a420-111852825393-config\") pod \"route-controller-manager-58f5c867fd-r8rn9\" (UID: \"8cf7ea32-0ab2-4496-a420-111852825393\") " pod="openshift-route-controller-manager/route-controller-manager-58f5c867fd-r8rn9"
Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.379269 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/da644329-3e0d-46bc-b763-979b8d2d3926-client-ca\") pod \"controller-manager-75664d8567-9mxsx\" (UID: \"da644329-3e0d-46bc-b763-979b8d2d3926\") " pod="openshift-controller-manager/controller-manager-75664d8567-9mxsx"
Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.480569 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cf7ea32-0ab2-4496-a420-111852825393-serving-cert\") pod \"route-controller-manager-58f5c867fd-r8rn9\" (UID: \"8cf7ea32-0ab2-4496-a420-111852825393\") " pod="openshift-route-controller-manager/route-controller-manager-58f5c867fd-r8rn9"
pod="openshift-controller-manager/controller-manager-75664d8567-9mxsx" Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.480685 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/da644329-3e0d-46bc-b763-979b8d2d3926-proxy-ca-bundles\") pod \"controller-manager-75664d8567-9mxsx\" (UID: \"da644329-3e0d-46bc-b763-979b8d2d3926\") " pod="openshift-controller-manager/controller-manager-75664d8567-9mxsx" Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.480739 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8cf7ea32-0ab2-4496-a420-111852825393-client-ca\") pod \"route-controller-manager-58f5c867fd-r8rn9\" (UID: \"8cf7ea32-0ab2-4496-a420-111852825393\") " pod="openshift-route-controller-manager/route-controller-manager-58f5c867fd-r8rn9" Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.480789 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cf7ea32-0ab2-4496-a420-111852825393-config\") pod \"route-controller-manager-58f5c867fd-r8rn9\" (UID: \"8cf7ea32-0ab2-4496-a420-111852825393\") " pod="openshift-route-controller-manager/route-controller-manager-58f5c867fd-r8rn9" Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.480818 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/da644329-3e0d-46bc-b763-979b8d2d3926-client-ca\") pod \"controller-manager-75664d8567-9mxsx\" (UID: \"da644329-3e0d-46bc-b763-979b8d2d3926\") " pod="openshift-controller-manager/controller-manager-75664d8567-9mxsx" Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.480878 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpfgd\" (UniqueName: \"kubernetes.io/projected/8cf7ea32-0ab2-4496-a420-111852825393-kube-api-access-hpfgd\") pod \"route-controller-manager-58f5c867fd-r8rn9\" (UID: \"8cf7ea32-0ab2-4496-a420-111852825393\") " pod="openshift-route-controller-manager/route-controller-manager-58f5c867fd-r8rn9" Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.480910 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da644329-3e0d-46bc-b763-979b8d2d3926-serving-cert\") pod \"controller-manager-75664d8567-9mxsx\" (UID: \"da644329-3e0d-46bc-b763-979b8d2d3926\") " pod="openshift-controller-manager/controller-manager-75664d8567-9mxsx" Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.480956 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da644329-3e0d-46bc-b763-979b8d2d3926-config\") pod \"controller-manager-75664d8567-9mxsx\" (UID: \"da644329-3e0d-46bc-b763-979b8d2d3926\") " pod="openshift-controller-manager/controller-manager-75664d8567-9mxsx" Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.481010 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvcl6\" (UniqueName: \"kubernetes.io/projected/0fef1b07-a814-496c-913e-301e76688b96-kube-api-access-gvcl6\") pod \"auto-csr-approver-29566518-llhfx\" (UID: \"0fef1b07-a814-496c-913e-301e76688b96\") " pod="openshift-infra/auto-csr-approver-29566518-llhfx" Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.482049 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8cf7ea32-0ab2-4496-a420-111852825393-client-ca\") pod \"route-controller-manager-58f5c867fd-r8rn9\" (UID: \"8cf7ea32-0ab2-4496-a420-111852825393\") " pod="openshift-route-controller-manager/route-controller-manager-58f5c867fd-r8rn9" Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.482660 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/da644329-3e0d-46bc-b763-979b8d2d3926-client-ca\") pod \"controller-manager-75664d8567-9mxsx\" (UID: \"da644329-3e0d-46bc-b763-979b8d2d3926\") " pod="openshift-controller-manager/controller-manager-75664d8567-9mxsx" Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.482688 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cf7ea32-0ab2-4496-a420-111852825393-config\") pod \"route-controller-manager-58f5c867fd-r8rn9\" (UID: \"8cf7ea32-0ab2-4496-a420-111852825393\") " pod="openshift-route-controller-manager/route-controller-manager-58f5c867fd-r8rn9" Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.482992 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da644329-3e0d-46bc-b763-979b8d2d3926-config\") pod \"controller-manager-75664d8567-9mxsx\" (UID: \"da644329-3e0d-46bc-b763-979b8d2d3926\") " pod="openshift-controller-manager/controller-manager-75664d8567-9mxsx" Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.485032 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/da644329-3e0d-46bc-b763-979b8d2d3926-proxy-ca-bundles\") pod \"controller-manager-75664d8567-9mxsx\" (UID: \"da644329-3e0d-46bc-b763-979b8d2d3926\") " pod="openshift-controller-manager/controller-manager-75664d8567-9mxsx" Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.489130 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da644329-3e0d-46bc-b763-979b8d2d3926-serving-cert\") pod \"controller-manager-75664d8567-9mxsx\" (UID: \"da644329-3e0d-46bc-b763-979b8d2d3926\") " pod="openshift-controller-manager/controller-manager-75664d8567-9mxsx" Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.496188 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.496647 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.499412 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cf7ea32-0ab2-4496-a420-111852825393-serving-cert\") pod \"route-controller-manager-58f5c867fd-r8rn9\" (UID: \"8cf7ea32-0ab2-4496-a420-111852825393\") " pod="openshift-route-controller-manager/route-controller-manager-58f5c867fd-r8rn9" Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.501636 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpfgd\" (UniqueName: \"kubernetes.io/projected/8cf7ea32-0ab2-4496-a420-111852825393-kube-api-access-hpfgd\") pod \"route-controller-manager-58f5c867fd-r8rn9\" (UID: \"8cf7ea32-0ab2-4496-a420-111852825393\") " 
pod="openshift-route-controller-manager/route-controller-manager-58f5c867fd-r8rn9" Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.505613 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwdqb\" (UniqueName: \"kubernetes.io/projected/da644329-3e0d-46bc-b763-979b8d2d3926-kube-api-access-xwdqb\") pod \"controller-manager-75664d8567-9mxsx\" (UID: \"da644329-3e0d-46bc-b763-979b8d2d3926\") " pod="openshift-controller-manager/controller-manager-75664d8567-9mxsx" Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.507038 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvcl6\" (UniqueName: \"kubernetes.io/projected/0fef1b07-a814-496c-913e-301e76688b96-kube-api-access-gvcl6\") pod \"auto-csr-approver-29566518-llhfx\" (UID: \"0fef1b07-a814-496c-913e-301e76688b96\") " pod="openshift-infra/auto-csr-approver-29566518-llhfx" Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.518273 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.591336 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566518-llhfx" Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.595830 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.602331 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-75664d8567-9mxsx" Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.608854 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58f5c867fd-r8rn9" Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.628519 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.661988 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.787209 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.827779 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.896032 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.901755 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 20 07:18:15 crc kubenswrapper[4749]: I0320 07:18:15.997869 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 20 07:18:16 crc kubenswrapper[4749]: I0320 07:18:16.020146 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 20 07:18:16 crc kubenswrapper[4749]: I0320 07:18:16.070422 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 20 07:18:16 crc kubenswrapper[4749]: I0320 07:18:16.090664 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 20 07:18:16 crc kubenswrapper[4749]: I0320 07:18:16.235435 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 20 07:18:16 crc kubenswrapper[4749]: I0320 07:18:16.340546 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 20 07:18:16 crc kubenswrapper[4749]: I0320 07:18:16.349097 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 20 07:18:16 crc kubenswrapper[4749]: I0320 07:18:16.353605 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 20 07:18:16 crc kubenswrapper[4749]: I0320 07:18:16.366651 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 20 07:18:16 crc kubenswrapper[4749]: I0320 07:18:16.462534 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 20 07:18:16 crc kubenswrapper[4749]: I0320 07:18:16.536665 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 20 07:18:16 crc kubenswrapper[4749]: I0320 07:18:16.549531 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 20 07:18:16 crc kubenswrapper[4749]: I0320 07:18:16.572636 4749 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-operator"/"iptables-alerter-script" Mar 20 07:18:16 crc kubenswrapper[4749]: I0320 07:18:16.578310 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 20 07:18:16 crc kubenswrapper[4749]: I0320 07:18:16.597160 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 20 07:18:16 crc kubenswrapper[4749]: I0320 07:18:16.626298 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 07:18:16 crc kubenswrapper[4749]: I0320 07:18:16.744025 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 20 07:18:16 crc kubenswrapper[4749]: I0320 07:18:16.771477 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 20 07:18:16 crc kubenswrapper[4749]: I0320 07:18:16.949967 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 20 07:18:16 crc kubenswrapper[4749]: I0320 07:18:16.959878 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 20 07:18:17 crc kubenswrapper[4749]: I0320 07:18:17.029353 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 20 07:18:17 crc kubenswrapper[4749]: I0320 07:18:17.063242 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 20 07:18:17 crc kubenswrapper[4749]: I0320 07:18:17.076909 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 20 07:18:17 crc kubenswrapper[4749]: I0320 07:18:17.161596 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 20 07:18:17 crc kubenswrapper[4749]: I0320 07:18:17.177359 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 20 07:18:17 crc kubenswrapper[4749]: I0320 07:18:17.266012 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 20 07:18:17 crc kubenswrapper[4749]: I0320 07:18:17.331934 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 20 07:18:17 crc kubenswrapper[4749]: I0320 07:18:17.355152 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 20 07:18:17 crc kubenswrapper[4749]: I0320 07:18:17.394649 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 20 07:18:17 crc kubenswrapper[4749]: I0320 07:18:17.473383 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 20 07:18:17 crc kubenswrapper[4749]: I0320 07:18:17.473631 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 20 07:18:17 crc kubenswrapper[4749]: I0320 07:18:17.617726 4749 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 20 07:18:17 crc kubenswrapper[4749]: I0320 07:18:17.635320 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 20 07:18:17 crc kubenswrapper[4749]: I0320 07:18:17.707876 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 20 07:18:17 crc kubenswrapper[4749]: I0320 07:18:17.712551 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 20 07:18:17 crc kubenswrapper[4749]: I0320 07:18:17.819469 4749 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 20 07:18:17 crc kubenswrapper[4749]: I0320 07:18:17.832788 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 20 07:18:17 crc kubenswrapper[4749]: I0320 07:18:17.914772 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 20 07:18:17 crc kubenswrapper[4749]: I0320 07:18:17.927495 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 20 07:18:17 crc kubenswrapper[4749]: I0320 07:18:17.991404 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 20 07:18:18 crc kubenswrapper[4749]: I0320 07:18:18.002623 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 20 07:18:18 crc kubenswrapper[4749]: I0320 07:18:18.043812 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 20 07:18:18 crc kubenswrapper[4749]: I0320 07:18:18.053641 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 20 07:18:18 crc kubenswrapper[4749]: I0320 07:18:18.168049 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 20 07:18:18 crc kubenswrapper[4749]: I0320 07:18:18.173667 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 20 07:18:18 crc kubenswrapper[4749]: I0320 07:18:18.175759 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 20 07:18:18 crc kubenswrapper[4749]: I0320 07:18:18.180271 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 20 07:18:18 crc kubenswrapper[4749]: I0320 07:18:18.386504 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 20 07:18:18 crc kubenswrapper[4749]: I0320 07:18:18.433235 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 20 07:18:18 crc kubenswrapper[4749]: I0320 07:18:18.481119 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 20 07:18:18 crc kubenswrapper[4749]: I0320 07:18:18.512979 4749 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 20 07:18:18 crc kubenswrapper[4749]: I0320 07:18:18.614130 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 07:18:18 crc kubenswrapper[4749]: I0320 07:18:18.632908 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 20 07:18:18 crc kubenswrapper[4749]: I0320 07:18:18.726163 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 20 07:18:18 crc kubenswrapper[4749]: I0320 07:18:18.770656 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 20 07:18:18 crc kubenswrapper[4749]: I0320 07:18:18.823415 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 20 07:18:18 crc kubenswrapper[4749]: I0320 07:18:18.909793 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 20 07:18:19 crc kubenswrapper[4749]: I0320 07:18:19.014402 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 20 07:18:19 crc kubenswrapper[4749]: I0320 07:18:19.156793 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 20 07:18:19 crc kubenswrapper[4749]: I0320 07:18:19.192476 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 20 07:18:19 crc kubenswrapper[4749]: I0320 07:18:19.204535 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 20 07:18:19 crc kubenswrapper[4749]: I0320 07:18:19.212946 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 20 07:18:19 crc kubenswrapper[4749]: I0320 07:18:19.229863 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 20 07:18:19 crc kubenswrapper[4749]: I0320 07:18:19.355461 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 20 07:18:19 crc kubenswrapper[4749]: I0320 07:18:19.395777 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 20 07:18:19 crc kubenswrapper[4749]: I0320 07:18:19.418556 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 20 07:18:19 crc kubenswrapper[4749]: I0320 07:18:19.424462 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 20 07:18:19 crc kubenswrapper[4749]: I0320 07:18:19.431839 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 20 07:18:19 crc kubenswrapper[4749]: I0320 07:18:19.497944 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 20 07:18:19 crc kubenswrapper[4749]: I0320 
07:18:19.508228 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 20 07:18:19 crc kubenswrapper[4749]: I0320 07:18:19.731007 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 20 07:18:19 crc kubenswrapper[4749]: I0320 07:18:19.772775 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 20 07:18:19 crc kubenswrapper[4749]: I0320 07:18:19.805203 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 20 07:18:19 crc kubenswrapper[4749]: I0320 07:18:19.938660 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 20 07:18:20 crc kubenswrapper[4749]: I0320 07:18:20.026202 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 20 07:18:20 crc kubenswrapper[4749]: I0320 07:18:20.104331 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 20 07:18:20 crc kubenswrapper[4749]: I0320 07:18:20.156202 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 20 07:18:20 crc kubenswrapper[4749]: I0320 07:18:20.179962 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 20 07:18:20 crc kubenswrapper[4749]: I0320 07:18:20.223244 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 20 07:18:20 crc kubenswrapper[4749]: I0320 07:18:20.305801 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 20 07:18:20 crc kubenswrapper[4749]: I0320 07:18:20.342589 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 20 07:18:20 crc kubenswrapper[4749]: I0320 07:18:20.433834 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 20 07:18:20 crc kubenswrapper[4749]: I0320 07:18:20.574565 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 20 07:18:20 crc kubenswrapper[4749]: I0320 07:18:20.610640 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 20 07:18:20 crc kubenswrapper[4749]: I0320 07:18:20.662891 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 20 07:18:20 crc kubenswrapper[4749]: I0320 07:18:20.683712 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 20 07:18:20 crc kubenswrapper[4749]: I0320 07:18:20.767122 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 20 07:18:20 crc kubenswrapper[4749]: I0320 07:18:20.892031 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 20 07:18:20 crc kubenswrapper[4749]: I0320 07:18:20.931744 4749 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 20 07:18:20 crc kubenswrapper[4749]: I0320 07:18:20.938412 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 20 07:18:20 crc kubenswrapper[4749]: I0320 07:18:20.965638 4749 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 07:18:20 crc kubenswrapper[4749]: I0320 07:18:20.966032 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://5f79f11fe5f1911b3210b36c9a630f224a7c92db0f2ba3a961bdb7d93f736d32" gracePeriod=5 Mar 20 07:18:21 crc kubenswrapper[4749]: I0320 07:18:21.152910 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 20 07:18:21 crc kubenswrapper[4749]: I0320 07:18:21.183707 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 20 07:18:21 crc kubenswrapper[4749]: I0320 07:18:21.189203 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 20 07:18:21 crc kubenswrapper[4749]: I0320 07:18:21.226471 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 20 07:18:21 crc kubenswrapper[4749]: I0320 07:18:21.442615 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 20 07:18:21 crc kubenswrapper[4749]: I0320 07:18:21.597011 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 20 07:18:22 crc kubenswrapper[4749]: I0320 07:18:22.029228 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 20 07:18:22 crc kubenswrapper[4749]: I0320 07:18:22.196949 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 20 07:18:22 crc kubenswrapper[4749]: I0320 07:18:22.348651 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 20 07:18:22 crc kubenswrapper[4749]: I0320 07:18:22.349245 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 20 07:18:22 crc kubenswrapper[4749]: I0320 07:18:22.378193 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 20 07:18:22 crc kubenswrapper[4749]: I0320 07:18:22.423528 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 20 07:18:22 crc kubenswrapper[4749]: I0320 07:18:22.454850 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 20 07:18:22 crc kubenswrapper[4749]: I0320 07:18:22.465246 4749 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca-operator"/"serving-cert" Mar 20 07:18:22 crc kubenswrapper[4749]: I0320 07:18:22.615325 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 20 07:18:22 crc kubenswrapper[4749]: I0320 07:18:22.808050 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 20 07:18:22 crc kubenswrapper[4749]: I0320 07:18:22.980047 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 20 07:18:23 crc kubenswrapper[4749]: I0320 07:18:23.047195 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 20 07:18:23 crc kubenswrapper[4749]: I0320 07:18:23.125099 4749 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 20 07:18:23 crc kubenswrapper[4749]: I0320 07:18:23.178203 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 20 07:18:23 crc kubenswrapper[4749]: I0320 07:18:23.222515 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 20 07:18:23 crc kubenswrapper[4749]: I0320 07:18:23.392717 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-75664d8567-9mxsx"] Mar 20 07:18:23 crc kubenswrapper[4749]: I0320 07:18:23.396568 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566518-llhfx"] Mar 20 07:18:23 crc kubenswrapper[4749]: I0320 07:18:23.412488 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58f5c867fd-r8rn9"] Mar 20 07:18:23 crc kubenswrapper[4749]: I0320 07:18:23.642084 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 20 07:18:23 crc kubenswrapper[4749]: I0320 07:18:23.664125 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 20 07:18:23 crc kubenswrapper[4749]: I0320 07:18:23.681250 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 20 07:18:23 crc kubenswrapper[4749]: I0320 07:18:23.729682 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 20 07:18:23 crc kubenswrapper[4749]: I0320 07:18:23.772701 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58f5c867fd-r8rn9"] Mar 20 07:18:23 crc kubenswrapper[4749]: I0320 07:18:23.829534 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566518-llhfx"] Mar 20 07:18:23 crc kubenswrapper[4749]: W0320 07:18:23.836981 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0fef1b07_a814_496c_913e_301e76688b96.slice/crio-fdf3e183829e4f5813909741459fd458d8b5167f175419e247603ed8d4c69ac7 WatchSource:0}: Error finding container fdf3e183829e4f5813909741459fd458d8b5167f175419e247603ed8d4c69ac7: Status 404 returned error can't find the container with id 
fdf3e183829e4f5813909741459fd458d8b5167f175419e247603ed8d4c69ac7 Mar 20 07:18:23 crc kubenswrapper[4749]: I0320 07:18:23.842240 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-75664d8567-9mxsx"] Mar 20 07:18:23 crc kubenswrapper[4749]: W0320 07:18:23.853122 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda644329_3e0d_46bc_b763_979b8d2d3926.slice/crio-1870aaef81ba72163243ede8f35b94e24ddaa7904b8769da0c8d6ec2ae4e9e81 WatchSource:0}: Error finding container 1870aaef81ba72163243ede8f35b94e24ddaa7904b8769da0c8d6ec2ae4e9e81: Status 404 returned error can't find the container with id 1870aaef81ba72163243ede8f35b94e24ddaa7904b8769da0c8d6ec2ae4e9e81 Mar 20 07:18:23 crc kubenswrapper[4749]: I0320 07:18:23.945558 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 20 07:18:23 crc kubenswrapper[4749]: I0320 07:18:23.955106 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 20 07:18:24 crc kubenswrapper[4749]: I0320 07:18:24.042162 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75664d8567-9mxsx" event={"ID":"da644329-3e0d-46bc-b763-979b8d2d3926","Type":"ContainerStarted","Data":"c151a46b45608eb26d4b1190dc852a26332a7751eb3952e8efa395e2dfb2dd9b"} Mar 20 07:18:24 crc kubenswrapper[4749]: I0320 07:18:24.042659 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-75664d8567-9mxsx" Mar 20 07:18:24 crc kubenswrapper[4749]: I0320 07:18:24.042722 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75664d8567-9mxsx" event={"ID":"da644329-3e0d-46bc-b763-979b8d2d3926","Type":"ContainerStarted","Data":"1870aaef81ba72163243ede8f35b94e24ddaa7904b8769da0c8d6ec2ae4e9e81"} Mar 20 07:18:24 crc kubenswrapper[4749]: I0320 07:18:24.044313 4749 patch_prober.go:28] interesting pod/controller-manager-75664d8567-9mxsx container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.66:8443/healthz\": dial tcp 10.217.0.66:8443: connect: connection refused" start-of-body= Mar 20 07:18:24 crc kubenswrapper[4749]: I0320 07:18:24.044356 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-75664d8567-9mxsx" podUID="da644329-3e0d-46bc-b763-979b8d2d3926" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.66:8443/healthz\": dial tcp 10.217.0.66:8443: connect: connection refused" Mar 20 07:18:24 crc kubenswrapper[4749]: I0320 07:18:24.045133 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-58f5c867fd-r8rn9" event={"ID":"8cf7ea32-0ab2-4496-a420-111852825393","Type":"ContainerStarted","Data":"298d7d06ea551b6f35c9dac1c71ea1515e9da722f667cec47d5857c846464a13"} Mar 20 07:18:24 crc kubenswrapper[4749]: I0320 07:18:24.045166 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-58f5c867fd-r8rn9" event={"ID":"8cf7ea32-0ab2-4496-a420-111852825393","Type":"ContainerStarted","Data":"50cf4bd76771edd3103d5a2c3d6d2ce000e820448012bb0e5b1540a59afef17e"} Mar 20 07:18:24 crc 
kubenswrapper[4749]: I0320 07:18:24.045620 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-58f5c867fd-r8rn9" Mar 20 07:18:24 crc kubenswrapper[4749]: I0320 07:18:24.046972 4749 patch_prober.go:28] interesting pod/route-controller-manager-58f5c867fd-r8rn9 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": dial tcp 10.217.0.68:8443: connect: connection refused" start-of-body= Mar 20 07:18:24 crc kubenswrapper[4749]: I0320 07:18:24.047070 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-58f5c867fd-r8rn9" podUID="8cf7ea32-0ab2-4496-a420-111852825393" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": dial tcp 10.217.0.68:8443: connect: connection refused" Mar 20 07:18:24 crc kubenswrapper[4749]: I0320 07:18:24.048437 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566518-llhfx" event={"ID":"0fef1b07-a814-496c-913e-301e76688b96","Type":"ContainerStarted","Data":"fdf3e183829e4f5813909741459fd458d8b5167f175419e247603ed8d4c69ac7"} Mar 20 07:18:24 crc kubenswrapper[4749]: I0320 07:18:24.068219 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-75664d8567-9mxsx" podStartSLOduration=48.068191771 podStartE2EDuration="48.068191771s" podCreationTimestamp="2026-03-20 07:17:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:18:24.062511908 +0000 UTC m=+340.612169555" watchObservedRunningTime="2026-03-20 07:18:24.068191771 +0000 UTC m=+340.617849428" Mar 20 07:18:24 crc kubenswrapper[4749]: I0320 07:18:24.084039 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-58f5c867fd-r8rn9" podStartSLOduration=48.084018739 podStartE2EDuration="48.084018739s" podCreationTimestamp="2026-03-20 07:17:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:18:24.080689331 +0000 UTC m=+340.630346988" watchObservedRunningTime="2026-03-20 07:18:24.084018739 +0000 UTC m=+340.633676386" Mar 20 07:18:24 crc kubenswrapper[4749]: I0320 07:18:24.196652 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 20 07:18:24 crc kubenswrapper[4749]: I0320 07:18:24.364711 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 20 07:18:24 crc kubenswrapper[4749]: I0320 07:18:24.420716 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 20 07:18:24 crc kubenswrapper[4749]: I0320 07:18:24.453364 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 20 07:18:24 crc kubenswrapper[4749]: I0320 07:18:24.749412 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 20 07:18:24 crc kubenswrapper[4749]: I0320 07:18:24.803626 4749 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 07:18:24 crc kubenswrapper[4749]: I0320 07:18:24.919309 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 20 07:18:25 crc kubenswrapper[4749]: I0320 07:18:25.025657 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 20 07:18:25 crc kubenswrapper[4749]: I0320 07:18:25.059444 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-58f5c867fd-r8rn9" Mar 20 07:18:25 crc kubenswrapper[4749]: I0320 07:18:25.060571 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-75664d8567-9mxsx" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.062573 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.062624 4749 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="5f79f11fe5f1911b3210b36c9a630f224a7c92db0f2ba3a961bdb7d93f736d32" exitCode=137 Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.063202 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da9be3cf5eb6b6be6b05a68dd94d52285f1cd6b97b304ac5e9bbddf64dd4d52d" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.118101 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7484f6b95f-zpp2f"] Mar 20 07:18:26 crc kubenswrapper[4749]: E0320 07:18:26.118371 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.118387 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.118520 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.119004 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7484f6b95f-zpp2f" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.119235 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.119338 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.123887 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.123893 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.124087 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.124325 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.124661 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.125403 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.128401 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.128537 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.128854 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.130461 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.130662 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7484f6b95f-zpp2f"] Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.130774 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.131874 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.138987 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.142871 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.147690 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.183933 4749 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.203666 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 
07:18:26.203708 4749 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="9a86d937-d789-4d8f-b3a2-0f0eacb2e601" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.206243 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.206304 4749 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="9a86d937-d789-4d8f-b3a2-0f0eacb2e601" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.278953 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.279002 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.279047 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.279125 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.279144 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.279273 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/77c6f57d-a893-4d26-b19f-d6690c6523a7-v4-0-config-system-router-certs\") pod \"oauth-openshift-7484f6b95f-zpp2f\" (UID: \"77c6f57d-a893-4d26-b19f-d6690c6523a7\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-zpp2f" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.279317 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/77c6f57d-a893-4d26-b19f-d6690c6523a7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7484f6b95f-zpp2f\" (UID: \"77c6f57d-a893-4d26-b19f-d6690c6523a7\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-zpp2f" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.279338 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/77c6f57d-a893-4d26-b19f-d6690c6523a7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7484f6b95f-zpp2f\" (UID: \"77c6f57d-a893-4d26-b19f-d6690c6523a7\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-zpp2f" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.279362 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/77c6f57d-a893-4d26-b19f-d6690c6523a7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7484f6b95f-zpp2f\" (UID: \"77c6f57d-a893-4d26-b19f-d6690c6523a7\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-zpp2f" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.279388 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/77c6f57d-a893-4d26-b19f-d6690c6523a7-v4-0-config-user-template-error\") pod \"oauth-openshift-7484f6b95f-zpp2f\" (UID: \"77c6f57d-a893-4d26-b19f-d6690c6523a7\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-zpp2f" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.279409 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv5br\" (UniqueName: \"kubernetes.io/projected/77c6f57d-a893-4d26-b19f-d6690c6523a7-kube-api-access-cv5br\") pod \"oauth-openshift-7484f6b95f-zpp2f\" (UID: \"77c6f57d-a893-4d26-b19f-d6690c6523a7\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-zpp2f" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.279433 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/77c6f57d-a893-4d26-b19f-d6690c6523a7-v4-0-config-user-template-login\") pod \"oauth-openshift-7484f6b95f-zpp2f\" (UID: \"77c6f57d-a893-4d26-b19f-d6690c6523a7\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-zpp2f" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.279457 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/77c6f57d-a893-4d26-b19f-d6690c6523a7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7484f6b95f-zpp2f\" (UID: \"77c6f57d-a893-4d26-b19f-d6690c6523a7\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-zpp2f" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.279485 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/77c6f57d-a893-4d26-b19f-d6690c6523a7-audit-policies\") pod \"oauth-openshift-7484f6b95f-zpp2f\" (UID: \"77c6f57d-a893-4d26-b19f-d6690c6523a7\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-zpp2f" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.279522 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/77c6f57d-a893-4d26-b19f-d6690c6523a7-v4-0-config-system-service-ca\") pod \"oauth-openshift-7484f6b95f-zpp2f\" (UID: \"77c6f57d-a893-4d26-b19f-d6690c6523a7\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-zpp2f" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.279567 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/77c6f57d-a893-4d26-b19f-d6690c6523a7-audit-dir\") pod \"oauth-openshift-7484f6b95f-zpp2f\" (UID: \"77c6f57d-a893-4d26-b19f-d6690c6523a7\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-zpp2f" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.279585 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/77c6f57d-a893-4d26-b19f-d6690c6523a7-v4-0-config-system-session\") pod \"oauth-openshift-7484f6b95f-zpp2f\" (UID: \"77c6f57d-a893-4d26-b19f-d6690c6523a7\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-zpp2f" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.279600 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/77c6f57d-a893-4d26-b19f-d6690c6523a7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7484f6b95f-zpp2f\" (UID: \"77c6f57d-a893-4d26-b19f-d6690c6523a7\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-zpp2f" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.279616 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/77c6f57d-a893-4d26-b19f-d6690c6523a7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7484f6b95f-zpp2f\" (UID: \"77c6f57d-a893-4d26-b19f-d6690c6523a7\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-zpp2f" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.279703 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.279725 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.280064 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.280195 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.288094 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.380214 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/77c6f57d-a893-4d26-b19f-d6690c6523a7-audit-policies\") pod \"oauth-openshift-7484f6b95f-zpp2f\" (UID: \"77c6f57d-a893-4d26-b19f-d6690c6523a7\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-zpp2f" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.380271 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/77c6f57d-a893-4d26-b19f-d6690c6523a7-v4-0-config-system-service-ca\") pod \"oauth-openshift-7484f6b95f-zpp2f\" (UID: \"77c6f57d-a893-4d26-b19f-d6690c6523a7\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-zpp2f" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.380337 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/77c6f57d-a893-4d26-b19f-d6690c6523a7-audit-dir\") pod \"oauth-openshift-7484f6b95f-zpp2f\" (UID: \"77c6f57d-a893-4d26-b19f-d6690c6523a7\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-zpp2f" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.380355 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/77c6f57d-a893-4d26-b19f-d6690c6523a7-v4-0-config-system-session\") pod \"oauth-openshift-7484f6b95f-zpp2f\" (UID: \"77c6f57d-a893-4d26-b19f-d6690c6523a7\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-zpp2f" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.380373 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/77c6f57d-a893-4d26-b19f-d6690c6523a7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7484f6b95f-zpp2f\" (UID: \"77c6f57d-a893-4d26-b19f-d6690c6523a7\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-zpp2f" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.380391 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/77c6f57d-a893-4d26-b19f-d6690c6523a7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7484f6b95f-zpp2f\" (UID: \"77c6f57d-a893-4d26-b19f-d6690c6523a7\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-zpp2f" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.380421 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/77c6f57d-a893-4d26-b19f-d6690c6523a7-v4-0-config-system-router-certs\") pod \"oauth-openshift-7484f6b95f-zpp2f\" (UID: \"77c6f57d-a893-4d26-b19f-d6690c6523a7\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-zpp2f" Mar 20 07:18:26 crc 
kubenswrapper[4749]: I0320 07:18:26.380444 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/77c6f57d-a893-4d26-b19f-d6690c6523a7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7484f6b95f-zpp2f\" (UID: \"77c6f57d-a893-4d26-b19f-d6690c6523a7\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-zpp2f" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.380462 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77c6f57d-a893-4d26-b19f-d6690c6523a7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7484f6b95f-zpp2f\" (UID: \"77c6f57d-a893-4d26-b19f-d6690c6523a7\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-zpp2f" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.380484 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/77c6f57d-a893-4d26-b19f-d6690c6523a7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7484f6b95f-zpp2f\" (UID: \"77c6f57d-a893-4d26-b19f-d6690c6523a7\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-zpp2f" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.380511 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/77c6f57d-a893-4d26-b19f-d6690c6523a7-v4-0-config-user-template-error\") pod \"oauth-openshift-7484f6b95f-zpp2f\" (UID: \"77c6f57d-a893-4d26-b19f-d6690c6523a7\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-zpp2f" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.380539 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cv5br\" (UniqueName: \"kubernetes.io/projected/77c6f57d-a893-4d26-b19f-d6690c6523a7-kube-api-access-cv5br\") pod \"oauth-openshift-7484f6b95f-zpp2f\" (UID: \"77c6f57d-a893-4d26-b19f-d6690c6523a7\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-zpp2f" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.380566 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/77c6f57d-a893-4d26-b19f-d6690c6523a7-v4-0-config-user-template-login\") pod \"oauth-openshift-7484f6b95f-zpp2f\" (UID: \"77c6f57d-a893-4d26-b19f-d6690c6523a7\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-zpp2f" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.380590 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/77c6f57d-a893-4d26-b19f-d6690c6523a7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7484f6b95f-zpp2f\" (UID: \"77c6f57d-a893-4d26-b19f-d6690c6523a7\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-zpp2f" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.380637 4749 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.380648 4749 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.380658 4749 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.380666 4749 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.380675 4749 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.380981 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/77c6f57d-a893-4d26-b19f-d6690c6523a7-audit-policies\") pod \"oauth-openshift-7484f6b95f-zpp2f\" (UID: \"77c6f57d-a893-4d26-b19f-d6690c6523a7\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-zpp2f" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.381060 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/77c6f57d-a893-4d26-b19f-d6690c6523a7-audit-dir\") pod \"oauth-openshift-7484f6b95f-zpp2f\" (UID: \"77c6f57d-a893-4d26-b19f-d6690c6523a7\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-zpp2f" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.381224 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/77c6f57d-a893-4d26-b19f-d6690c6523a7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7484f6b95f-zpp2f\" (UID: \"77c6f57d-a893-4d26-b19f-d6690c6523a7\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-zpp2f" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.381800 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77c6f57d-a893-4d26-b19f-d6690c6523a7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7484f6b95f-zpp2f\" (UID: \"77c6f57d-a893-4d26-b19f-d6690c6523a7\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-zpp2f" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.382461 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/77c6f57d-a893-4d26-b19f-d6690c6523a7-v4-0-config-system-service-ca\") pod \"oauth-openshift-7484f6b95f-zpp2f\" (UID: \"77c6f57d-a893-4d26-b19f-d6690c6523a7\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-zpp2f" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.384561 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/77c6f57d-a893-4d26-b19f-d6690c6523a7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7484f6b95f-zpp2f\" (UID: \"77c6f57d-a893-4d26-b19f-d6690c6523a7\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-zpp2f" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.385032 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/77c6f57d-a893-4d26-b19f-d6690c6523a7-v4-0-config-system-router-certs\") pod \"oauth-openshift-7484f6b95f-zpp2f\" (UID: \"77c6f57d-a893-4d26-b19f-d6690c6523a7\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-zpp2f" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.385158 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/77c6f57d-a893-4d26-b19f-d6690c6523a7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7484f6b95f-zpp2f\" (UID: \"77c6f57d-a893-4d26-b19f-d6690c6523a7\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-zpp2f" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.386582 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/77c6f57d-a893-4d26-b19f-d6690c6523a7-v4-0-config-user-template-error\") pod \"oauth-openshift-7484f6b95f-zpp2f\" (UID: \"77c6f57d-a893-4d26-b19f-d6690c6523a7\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-zpp2f" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.386673 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/77c6f57d-a893-4d26-b19f-d6690c6523a7-v4-0-config-system-session\") pod \"oauth-openshift-7484f6b95f-zpp2f\" (UID: \"77c6f57d-a893-4d26-b19f-d6690c6523a7\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-zpp2f" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.386998 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/77c6f57d-a893-4d26-b19f-d6690c6523a7-v4-0-config-user-template-login\") pod \"oauth-openshift-7484f6b95f-zpp2f\" (UID: \"77c6f57d-a893-4d26-b19f-d6690c6523a7\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-zpp2f" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.390248 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/77c6f57d-a893-4d26-b19f-d6690c6523a7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7484f6b95f-zpp2f\" (UID: \"77c6f57d-a893-4d26-b19f-d6690c6523a7\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-zpp2f" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.385682 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/77c6f57d-a893-4d26-b19f-d6690c6523a7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7484f6b95f-zpp2f\" (UID: \"77c6f57d-a893-4d26-b19f-d6690c6523a7\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-zpp2f" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.398983 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv5br\" (UniqueName: \"kubernetes.io/projected/77c6f57d-a893-4d26-b19f-d6690c6523a7-kube-api-access-cv5br\") pod \"oauth-openshift-7484f6b95f-zpp2f\" (UID: \"77c6f57d-a893-4d26-b19f-d6690c6523a7\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-zpp2f" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.470710 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-7484f6b95f-zpp2f" Mar 20 07:18:26 crc kubenswrapper[4749]: I0320 07:18:26.890392 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7484f6b95f-zpp2f"] Mar 20 07:18:26 crc kubenswrapper[4749]: W0320 07:18:26.904371 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77c6f57d_a893_4d26_b19f_d6690c6523a7.slice/crio-aced8e3c32e466fe04a6d59c1da8772851a8198c31ee9ccbe0c8e22cfb67d8f0 WatchSource:0}: Error finding container aced8e3c32e466fe04a6d59c1da8772851a8198c31ee9ccbe0c8e22cfb67d8f0: Status 404 returned error can't find the container with id aced8e3c32e466fe04a6d59c1da8772851a8198c31ee9ccbe0c8e22cfb67d8f0 Mar 20 07:18:27 crc kubenswrapper[4749]: I0320 07:18:27.067799 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7484f6b95f-zpp2f" event={"ID":"77c6f57d-a893-4d26-b19f-d6690c6523a7","Type":"ContainerStarted","Data":"aced8e3c32e466fe04a6d59c1da8772851a8198c31ee9ccbe0c8e22cfb67d8f0"} Mar 20 07:18:27 crc kubenswrapper[4749]: I0320 07:18:27.069597 4749 generic.go:334] "Generic (PLEG): container finished" podID="0fef1b07-a814-496c-913e-301e76688b96" containerID="8d8d5ff9976d99ae7241f0b9fbb00a5a21a77065bbbc9798582616722f36caf2" exitCode=0 Mar 20 07:18:27 crc kubenswrapper[4749]: I0320 07:18:27.069695 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 07:18:27 crc kubenswrapper[4749]: I0320 07:18:27.069733 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566518-llhfx" event={"ID":"0fef1b07-a814-496c-913e-301e76688b96","Type":"ContainerDied","Data":"8d8d5ff9976d99ae7241f0b9fbb00a5a21a77065bbbc9798582616722f36caf2"} Mar 20 07:18:28 crc kubenswrapper[4749]: I0320 07:18:28.079502 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7484f6b95f-zpp2f" event={"ID":"77c6f57d-a893-4d26-b19f-d6690c6523a7","Type":"ContainerStarted","Data":"c0396da07c2be8db2ab3fa20c5479ada8d8318dc70827ac9b5913971ca1bbe4c"} Mar 20 07:18:28 crc kubenswrapper[4749]: I0320 07:18:28.082628 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7484f6b95f-zpp2f" Mar 20 07:18:28 crc kubenswrapper[4749]: I0320 07:18:28.092726 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7484f6b95f-zpp2f" Mar 20 07:18:28 crc kubenswrapper[4749]: I0320 07:18:28.122719 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7484f6b95f-zpp2f" podStartSLOduration=69.122670257 podStartE2EDuration="1m9.122670257s" podCreationTimestamp="2026-03-20 07:17:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:18:28.11742282 +0000 UTC m=+344.667080547" watchObservedRunningTime="2026-03-20 07:18:28.122670257 +0000 UTC m=+344.672327904" Mar 20 07:18:28 crc kubenswrapper[4749]: I0320 07:18:28.192837 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 20 07:18:28 crc kubenswrapper[4749]: I0320 
07:18:28.373190 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566518-llhfx" Mar 20 07:18:28 crc kubenswrapper[4749]: I0320 07:18:28.412307 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvcl6\" (UniqueName: \"kubernetes.io/projected/0fef1b07-a814-496c-913e-301e76688b96-kube-api-access-gvcl6\") pod \"0fef1b07-a814-496c-913e-301e76688b96\" (UID: \"0fef1b07-a814-496c-913e-301e76688b96\") " Mar 20 07:18:28 crc kubenswrapper[4749]: I0320 07:18:28.419364 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fef1b07-a814-496c-913e-301e76688b96-kube-api-access-gvcl6" (OuterVolumeSpecName: "kube-api-access-gvcl6") pod "0fef1b07-a814-496c-913e-301e76688b96" (UID: "0fef1b07-a814-496c-913e-301e76688b96"). InnerVolumeSpecName "kube-api-access-gvcl6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:18:28 crc kubenswrapper[4749]: I0320 07:18:28.513735 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvcl6\" (UniqueName: \"kubernetes.io/projected/0fef1b07-a814-496c-913e-301e76688b96-kube-api-access-gvcl6\") on node \"crc\" DevicePath \"\"" Mar 20 07:18:29 crc kubenswrapper[4749]: I0320 07:18:29.087664 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566518-llhfx" Mar 20 07:18:29 crc kubenswrapper[4749]: I0320 07:18:29.088585 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566518-llhfx" event={"ID":"0fef1b07-a814-496c-913e-301e76688b96","Type":"ContainerDied","Data":"fdf3e183829e4f5813909741459fd458d8b5167f175419e247603ed8d4c69ac7"} Mar 20 07:18:29 crc kubenswrapper[4749]: I0320 07:18:29.088643 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdf3e183829e4f5813909741459fd458d8b5167f175419e247603ed8d4c69ac7" Mar 20 07:18:46 crc kubenswrapper[4749]: I0320 07:18:46.209127 4749 generic.go:334] "Generic (PLEG): container finished" podID="00730545-e9b7-4166-9f09-7a6fcac8cad3" containerID="67cb77d5c367914c438c14e7a50a60f7b25514b9f084346fde48e6b9dcbfb6c7" exitCode=0 Mar 20 07:18:46 crc kubenswrapper[4749]: I0320 07:18:46.209495 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-j79zb" event={"ID":"00730545-e9b7-4166-9f09-7a6fcac8cad3","Type":"ContainerDied","Data":"67cb77d5c367914c438c14e7a50a60f7b25514b9f084346fde48e6b9dcbfb6c7"} Mar 20 07:18:46 crc kubenswrapper[4749]: I0320 07:18:46.210595 4749 scope.go:117] "RemoveContainer" containerID="67cb77d5c367914c438c14e7a50a60f7b25514b9f084346fde48e6b9dcbfb6c7" Mar 20 07:18:47 crc kubenswrapper[4749]: I0320 07:18:47.219634 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-j79zb" event={"ID":"00730545-e9b7-4166-9f09-7a6fcac8cad3","Type":"ContainerStarted","Data":"8c2a06811a45e54524da4727153d331ebe96f784206d3bcc1fde7d8744eb21ff"} Mar 20 07:18:47 crc kubenswrapper[4749]: I0320 07:18:47.220924 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-j79zb" Mar 20 07:18:47 crc kubenswrapper[4749]: I0320 07:18:47.222446 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-j79zb" Mar 20 07:18:52 crc kubenswrapper[4749]: I0320 
07:18:52.300272 4749 scope.go:117] "RemoveContainer" containerID="b7a9d3d56425dd88c89608d446f6d44c5f90644cea243dd023e74c5630a0a99e" Mar 20 07:18:52 crc kubenswrapper[4749]: I0320 07:18:52.327577 4749 scope.go:117] "RemoveContainer" containerID="12cb6e64ecd020e07bd8f22e52fcf960c975a09da0f06a9f43daf5bfbff01de3" Mar 20 07:18:52 crc kubenswrapper[4749]: I0320 07:18:52.348544 4749 scope.go:117] "RemoveContainer" containerID="21e71bf5e132166e8d3e2f33eb325502e54ff36380220a07917135b27ebe41c6" Mar 20 07:18:52 crc kubenswrapper[4749]: I0320 07:18:52.362410 4749 scope.go:117] "RemoveContainer" containerID="b36dd931c1b5cad253e893ec8aa896786690c921285a4733fd5f3fed3db01ce7" Mar 20 07:18:52 crc kubenswrapper[4749]: I0320 07:18:52.388333 4749 scope.go:117] "RemoveContainer" containerID="5b332a4612c6855c57c6c15a305a1f56099dab01f849027ea2eeda56718010cc" Mar 20 07:19:04 crc kubenswrapper[4749]: I0320 07:19:04.514370 4749 patch_prober.go:28] interesting pod/machine-config-daemon-fxqfd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 07:19:04 crc kubenswrapper[4749]: I0320 07:19:04.514822 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:19:34 crc kubenswrapper[4749]: I0320 07:19:34.514589 4749 patch_prober.go:28] interesting pod/machine-config-daemon-fxqfd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 07:19:34 crc kubenswrapper[4749]: I0320 07:19:34.517071 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:19:53 crc kubenswrapper[4749]: I0320 07:19:53.233592 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-2q8qh"] Mar 20 07:19:53 crc kubenswrapper[4749]: E0320 07:19:53.234241 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fef1b07-a814-496c-913e-301e76688b96" containerName="oc" Mar 20 07:19:53 crc kubenswrapper[4749]: I0320 07:19:53.234255 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fef1b07-a814-496c-913e-301e76688b96" containerName="oc" Mar 20 07:19:53 crc kubenswrapper[4749]: I0320 07:19:53.234397 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fef1b07-a814-496c-913e-301e76688b96" containerName="oc" Mar 20 07:19:53 crc kubenswrapper[4749]: I0320 07:19:53.234815 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-2q8qh" Mar 20 07:19:53 crc kubenswrapper[4749]: I0320 07:19:53.301297 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-2q8qh"] Mar 20 07:19:53 crc kubenswrapper[4749]: I0320 07:19:53.331130 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwrng\" (UniqueName: \"kubernetes.io/projected/9035f45a-7cff-4404-bd23-e4ced58d240e-kube-api-access-wwrng\") pod \"image-registry-66df7c8f76-2q8qh\" (UID: \"9035f45a-7cff-4404-bd23-e4ced58d240e\") " pod="openshift-image-registry/image-registry-66df7c8f76-2q8qh" Mar 20 07:19:53 crc kubenswrapper[4749]: I0320 07:19:53.331185 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-2q8qh\" (UID: \"9035f45a-7cff-4404-bd23-e4ced58d240e\") " pod="openshift-image-registry/image-registry-66df7c8f76-2q8qh" Mar 20 07:19:53 crc kubenswrapper[4749]: I0320 07:19:53.331208 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9035f45a-7cff-4404-bd23-e4ced58d240e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-2q8qh\" (UID: \"9035f45a-7cff-4404-bd23-e4ced58d240e\") " pod="openshift-image-registry/image-registry-66df7c8f76-2q8qh" Mar 20 07:19:53 crc kubenswrapper[4749]: I0320 07:19:53.331675 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9035f45a-7cff-4404-bd23-e4ced58d240e-bound-sa-token\") pod \"image-registry-66df7c8f76-2q8qh\" (UID: \"9035f45a-7cff-4404-bd23-e4ced58d240e\") " pod="openshift-image-registry/image-registry-66df7c8f76-2q8qh" Mar 20 07:19:53 crc kubenswrapper[4749]: I0320 07:19:53.331808 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9035f45a-7cff-4404-bd23-e4ced58d240e-registry-certificates\") pod \"image-registry-66df7c8f76-2q8qh\" (UID: \"9035f45a-7cff-4404-bd23-e4ced58d240e\") " pod="openshift-image-registry/image-registry-66df7c8f76-2q8qh" Mar 20 07:19:53 crc kubenswrapper[4749]: I0320 07:19:53.331914 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9035f45a-7cff-4404-bd23-e4ced58d240e-trusted-ca\") pod \"image-registry-66df7c8f76-2q8qh\" (UID: \"9035f45a-7cff-4404-bd23-e4ced58d240e\") " pod="openshift-image-registry/image-registry-66df7c8f76-2q8qh" Mar 20 07:19:53 crc kubenswrapper[4749]: I0320 07:19:53.332062 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9035f45a-7cff-4404-bd23-e4ced58d240e-registry-tls\") pod \"image-registry-66df7c8f76-2q8qh\" (UID: \"9035f45a-7cff-4404-bd23-e4ced58d240e\") " pod="openshift-image-registry/image-registry-66df7c8f76-2q8qh" Mar 20 07:19:53 crc kubenswrapper[4749]: I0320 07:19:53.332147 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/9035f45a-7cff-4404-bd23-e4ced58d240e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-2q8qh\" (UID: \"9035f45a-7cff-4404-bd23-e4ced58d240e\") " pod="openshift-image-registry/image-registry-66df7c8f76-2q8qh" Mar 20 07:19:53 crc kubenswrapper[4749]: I0320 07:19:53.356823 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-2q8qh\" (UID: \"9035f45a-7cff-4404-bd23-e4ced58d240e\") " pod="openshift-image-registry/image-registry-66df7c8f76-2q8qh" Mar 20 07:19:53 crc kubenswrapper[4749]: I0320 07:19:53.433796 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwrng\" (UniqueName: \"kubernetes.io/projected/9035f45a-7cff-4404-bd23-e4ced58d240e-kube-api-access-wwrng\") pod \"image-registry-66df7c8f76-2q8qh\" (UID: \"9035f45a-7cff-4404-bd23-e4ced58d240e\") " pod="openshift-image-registry/image-registry-66df7c8f76-2q8qh" Mar 20 07:19:53 crc kubenswrapper[4749]: I0320 07:19:53.433845 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9035f45a-7cff-4404-bd23-e4ced58d240e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-2q8qh\" (UID: \"9035f45a-7cff-4404-bd23-e4ced58d240e\") " pod="openshift-image-registry/image-registry-66df7c8f76-2q8qh" Mar 20 07:19:53 crc kubenswrapper[4749]: I0320 07:19:53.433872 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9035f45a-7cff-4404-bd23-e4ced58d240e-bound-sa-token\") pod \"image-registry-66df7c8f76-2q8qh\" (UID: \"9035f45a-7cff-4404-bd23-e4ced58d240e\") " pod="openshift-image-registry/image-registry-66df7c8f76-2q8qh" Mar 20 07:19:53 crc kubenswrapper[4749]: I0320 07:19:53.433896 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9035f45a-7cff-4404-bd23-e4ced58d240e-registry-certificates\") pod \"image-registry-66df7c8f76-2q8qh\" (UID: \"9035f45a-7cff-4404-bd23-e4ced58d240e\") " pod="openshift-image-registry/image-registry-66df7c8f76-2q8qh" Mar 20 07:19:53 crc kubenswrapper[4749]: I0320 07:19:53.433915 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9035f45a-7cff-4404-bd23-e4ced58d240e-trusted-ca\") pod \"image-registry-66df7c8f76-2q8qh\" (UID: \"9035f45a-7cff-4404-bd23-e4ced58d240e\") " pod="openshift-image-registry/image-registry-66df7c8f76-2q8qh" Mar 20 07:19:53 crc kubenswrapper[4749]: I0320 07:19:53.433939 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9035f45a-7cff-4404-bd23-e4ced58d240e-registry-tls\") pod \"image-registry-66df7c8f76-2q8qh\" (UID: \"9035f45a-7cff-4404-bd23-e4ced58d240e\") " pod="openshift-image-registry/image-registry-66df7c8f76-2q8qh" Mar 20 07:19:53 crc kubenswrapper[4749]: I0320 07:19:53.433954 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9035f45a-7cff-4404-bd23-e4ced58d240e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-2q8qh\" (UID: \"9035f45a-7cff-4404-bd23-e4ced58d240e\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-2q8qh" Mar 20 07:19:53 crc kubenswrapper[4749]: I0320 07:19:53.434386 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9035f45a-7cff-4404-bd23-e4ced58d240e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-2q8qh\" (UID: \"9035f45a-7cff-4404-bd23-e4ced58d240e\") " pod="openshift-image-registry/image-registry-66df7c8f76-2q8qh" Mar 20 07:19:53 crc kubenswrapper[4749]: I0320 07:19:53.436509 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9035f45a-7cff-4404-bd23-e4ced58d240e-registry-certificates\") pod \"image-registry-66df7c8f76-2q8qh\" (UID: \"9035f45a-7cff-4404-bd23-e4ced58d240e\") " pod="openshift-image-registry/image-registry-66df7c8f76-2q8qh" Mar 20 07:19:53 crc kubenswrapper[4749]: I0320 07:19:53.437169 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9035f45a-7cff-4404-bd23-e4ced58d240e-trusted-ca\") pod \"image-registry-66df7c8f76-2q8qh\" (UID: \"9035f45a-7cff-4404-bd23-e4ced58d240e\") " pod="openshift-image-registry/image-registry-66df7c8f76-2q8qh" Mar 20 07:19:53 crc kubenswrapper[4749]: I0320 07:19:53.441661 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9035f45a-7cff-4404-bd23-e4ced58d240e-registry-tls\") pod \"image-registry-66df7c8f76-2q8qh\" (UID: \"9035f45a-7cff-4404-bd23-e4ced58d240e\") " pod="openshift-image-registry/image-registry-66df7c8f76-2q8qh" Mar 20 07:19:53 crc kubenswrapper[4749]: I0320 07:19:53.441924 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9035f45a-7cff-4404-bd23-e4ced58d240e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-2q8qh\" (UID: \"9035f45a-7cff-4404-bd23-e4ced58d240e\") " pod="openshift-image-registry/image-registry-66df7c8f76-2q8qh" Mar 20 07:19:53 crc kubenswrapper[4749]: I0320 07:19:53.456507 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwrng\" (UniqueName: \"kubernetes.io/projected/9035f45a-7cff-4404-bd23-e4ced58d240e-kube-api-access-wwrng\") pod \"image-registry-66df7c8f76-2q8qh\" (UID: \"9035f45a-7cff-4404-bd23-e4ced58d240e\") " pod="openshift-image-registry/image-registry-66df7c8f76-2q8qh" Mar 20 07:19:53 crc kubenswrapper[4749]: I0320 07:19:53.459208 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9035f45a-7cff-4404-bd23-e4ced58d240e-bound-sa-token\") pod \"image-registry-66df7c8f76-2q8qh\" (UID: \"9035f45a-7cff-4404-bd23-e4ced58d240e\") " pod="openshift-image-registry/image-registry-66df7c8f76-2q8qh" Mar 20 07:19:53 crc kubenswrapper[4749]: I0320 07:19:53.570998 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-2q8qh" Mar 20 07:19:54 crc kubenswrapper[4749]: I0320 07:19:54.035581 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-2q8qh"] Mar 20 07:19:54 crc kubenswrapper[4749]: I0320 07:19:54.662555 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-2q8qh" event={"ID":"9035f45a-7cff-4404-bd23-e4ced58d240e","Type":"ContainerStarted","Data":"c55619a5bf43708e97772f69c31b26193bc8d042c08d0a340aceca48f09610ff"} Mar 20 07:19:54 crc kubenswrapper[4749]: I0320 07:19:54.662850 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-2q8qh" Mar 20 07:19:54 crc kubenswrapper[4749]: I0320 07:19:54.662866 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-2q8qh" event={"ID":"9035f45a-7cff-4404-bd23-e4ced58d240e","Type":"ContainerStarted","Data":"193f3165230402219e78190d6247417424c69e69911b823980efe9f884cd0712"} Mar 20 07:19:54 crc kubenswrapper[4749]: I0320 07:19:54.701842 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-2q8qh" podStartSLOduration=1.701815564 podStartE2EDuration="1.701815564s" podCreationTimestamp="2026-03-20 07:19:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:19:54.695952964 +0000 UTC m=+431.245610671" watchObservedRunningTime="2026-03-20 07:19:54.701815564 +0000 UTC m=+431.251473251" Mar 20 07:19:57 crc kubenswrapper[4749]: I0320 07:19:57.971504 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9ss5d"] Mar 20 07:19:57 crc kubenswrapper[4749]: I0320 07:19:57.972504 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9ss5d" podUID="a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8" containerName="registry-server" containerID="cri-o://96c3778c4c41d3e6222d522de25ac4d41a6d02827a8c73b25ec24e7347258c10" gracePeriod=30 Mar 20 07:19:57 crc kubenswrapper[4749]: I0320 07:19:57.984252 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rv5h9"] Mar 20 07:19:57 crc kubenswrapper[4749]: I0320 07:19:57.984517 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rv5h9" podUID="b7e5d15e-f3f5-4595-be01-ae4f196285ad" containerName="registry-server" containerID="cri-o://5b2744784a89060251e31f3410f211c4fdd6a528ed1e0fd2cb6969196eadcefb" gracePeriod=30 Mar 20 07:19:57 crc kubenswrapper[4749]: I0320 07:19:57.999180 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-j79zb"] Mar 20 07:19:57 crc kubenswrapper[4749]: I0320 07:19:57.999420 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-j79zb" podUID="00730545-e9b7-4166-9f09-7a6fcac8cad3" containerName="marketplace-operator" containerID="cri-o://8c2a06811a45e54524da4727153d331ebe96f784206d3bcc1fde7d8744eb21ff" gracePeriod=30 Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.016077 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l72fj"] Mar 
20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.016400 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-l72fj" podUID="9c486dab-86dd-44dd-8c82-4c07ed84aa50" containerName="registry-server" containerID="cri-o://39fe9be0f912e80fe35c038568c54aa7f8ca9de79b1998dbdfc3281167af819e" gracePeriod=30 Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.022625 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m7xc9"] Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.023649 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-m7xc9" podUID="8faad596-00ed-4982-9f42-2f1a2465098c" containerName="registry-server" containerID="cri-o://7f3a1323d1b5bd0dc74dafb6010e8593316a78e287be8f743cad2b14377d198d" gracePeriod=30 Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.042220 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9lpwm"] Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.043189 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-9lpwm" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.058963 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9lpwm"] Mar 20 07:19:58 crc kubenswrapper[4749]: E0320 07:19:58.094831 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 39fe9be0f912e80fe35c038568c54aa7f8ca9de79b1998dbdfc3281167af819e is running failed: container process not found" containerID="39fe9be0f912e80fe35c038568c54aa7f8ca9de79b1998dbdfc3281167af819e" cmd=["grpc_health_probe","-addr=:50051"] Mar 20 07:19:58 crc kubenswrapper[4749]: E0320 07:19:58.095239 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 39fe9be0f912e80fe35c038568c54aa7f8ca9de79b1998dbdfc3281167af819e is running failed: container process not found" containerID="39fe9be0f912e80fe35c038568c54aa7f8ca9de79b1998dbdfc3281167af819e" cmd=["grpc_health_probe","-addr=:50051"] Mar 20 07:19:58 crc kubenswrapper[4749]: E0320 07:19:58.095490 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 39fe9be0f912e80fe35c038568c54aa7f8ca9de79b1998dbdfc3281167af819e is running failed: container process not found" containerID="39fe9be0f912e80fe35c038568c54aa7f8ca9de79b1998dbdfc3281167af819e" cmd=["grpc_health_probe","-addr=:50051"] Mar 20 07:19:58 crc kubenswrapper[4749]: E0320 07:19:58.095518 4749 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 39fe9be0f912e80fe35c038568c54aa7f8ca9de79b1998dbdfc3281167af819e is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-l72fj" podUID="9c486dab-86dd-44dd-8c82-4c07ed84aa50" containerName="registry-server" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.107837 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/ab07be1c-a7c8-4310-b2be-7dea01a4a55b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-9lpwm\" (UID: \"ab07be1c-a7c8-4310-b2be-7dea01a4a55b\") " pod="openshift-marketplace/marketplace-operator-79b997595-9lpwm" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.107868 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgqkx\" (UniqueName: \"kubernetes.io/projected/ab07be1c-a7c8-4310-b2be-7dea01a4a55b-kube-api-access-sgqkx\") pod \"marketplace-operator-79b997595-9lpwm\" (UID: \"ab07be1c-a7c8-4310-b2be-7dea01a4a55b\") " pod="openshift-marketplace/marketplace-operator-79b997595-9lpwm" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.107956 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ab07be1c-a7c8-4310-b2be-7dea01a4a55b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-9lpwm\" (UID: \"ab07be1c-a7c8-4310-b2be-7dea01a4a55b\") " pod="openshift-marketplace/marketplace-operator-79b997595-9lpwm" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.209476 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ab07be1c-a7c8-4310-b2be-7dea01a4a55b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-9lpwm\" (UID: \"ab07be1c-a7c8-4310-b2be-7dea01a4a55b\") " pod="openshift-marketplace/marketplace-operator-79b997595-9lpwm" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.209548 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ab07be1c-a7c8-4310-b2be-7dea01a4a55b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-9lpwm\" (UID: \"ab07be1c-a7c8-4310-b2be-7dea01a4a55b\") " pod="openshift-marketplace/marketplace-operator-79b997595-9lpwm" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.209565 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgqkx\" (UniqueName: \"kubernetes.io/projected/ab07be1c-a7c8-4310-b2be-7dea01a4a55b-kube-api-access-sgqkx\") pod \"marketplace-operator-79b997595-9lpwm\" (UID: \"ab07be1c-a7c8-4310-b2be-7dea01a4a55b\") " pod="openshift-marketplace/marketplace-operator-79b997595-9lpwm" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.212632 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ab07be1c-a7c8-4310-b2be-7dea01a4a55b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-9lpwm\" (UID: \"ab07be1c-a7c8-4310-b2be-7dea01a4a55b\") " pod="openshift-marketplace/marketplace-operator-79b997595-9lpwm" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.220570 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ab07be1c-a7c8-4310-b2be-7dea01a4a55b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-9lpwm\" (UID: \"ab07be1c-a7c8-4310-b2be-7dea01a4a55b\") " pod="openshift-marketplace/marketplace-operator-79b997595-9lpwm" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.227023 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgqkx\" (UniqueName: 
\"kubernetes.io/projected/ab07be1c-a7c8-4310-b2be-7dea01a4a55b-kube-api-access-sgqkx\") pod \"marketplace-operator-79b997595-9lpwm\" (UID: \"ab07be1c-a7c8-4310-b2be-7dea01a4a55b\") " pod="openshift-marketplace/marketplace-operator-79b997595-9lpwm" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.483824 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-9lpwm" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.503338 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rv5h9" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.512533 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l72fj" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.553332 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-j79zb" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.556474 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m7xc9" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.609709 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9ss5d" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.671570 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgp9z\" (UniqueName: \"kubernetes.io/projected/b7e5d15e-f3f5-4595-be01-ae4f196285ad-kube-api-access-fgp9z\") pod \"b7e5d15e-f3f5-4595-be01-ae4f196285ad\" (UID: \"b7e5d15e-f3f5-4595-be01-ae4f196285ad\") " Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.671619 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7e5d15e-f3f5-4595-be01-ae4f196285ad-utilities\") pod \"b7e5d15e-f3f5-4595-be01-ae4f196285ad\" (UID: \"b7e5d15e-f3f5-4595-be01-ae4f196285ad\") " Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.671653 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/00730545-e9b7-4166-9f09-7a6fcac8cad3-marketplace-operator-metrics\") pod \"00730545-e9b7-4166-9f09-7a6fcac8cad3\" (UID: \"00730545-e9b7-4166-9f09-7a6fcac8cad3\") " Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.671684 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8faad596-00ed-4982-9f42-2f1a2465098c-utilities\") pod \"8faad596-00ed-4982-9f42-2f1a2465098c\" (UID: \"8faad596-00ed-4982-9f42-2f1a2465098c\") " Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.671710 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c486dab-86dd-44dd-8c82-4c07ed84aa50-utilities\") pod \"9c486dab-86dd-44dd-8c82-4c07ed84aa50\" (UID: \"9c486dab-86dd-44dd-8c82-4c07ed84aa50\") " Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.671740 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7e5d15e-f3f5-4595-be01-ae4f196285ad-catalog-content\") pod \"b7e5d15e-f3f5-4595-be01-ae4f196285ad\" (UID: 
\"b7e5d15e-f3f5-4595-be01-ae4f196285ad\") " Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.671760 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9r2c\" (UniqueName: \"kubernetes.io/projected/9c486dab-86dd-44dd-8c82-4c07ed84aa50-kube-api-access-c9r2c\") pod \"9c486dab-86dd-44dd-8c82-4c07ed84aa50\" (UID: \"9c486dab-86dd-44dd-8c82-4c07ed84aa50\") " Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.671789 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnxph\" (UniqueName: \"kubernetes.io/projected/00730545-e9b7-4166-9f09-7a6fcac8cad3-kube-api-access-fnxph\") pod \"00730545-e9b7-4166-9f09-7a6fcac8cad3\" (UID: \"00730545-e9b7-4166-9f09-7a6fcac8cad3\") " Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.671813 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4zwd\" (UniqueName: \"kubernetes.io/projected/8faad596-00ed-4982-9f42-2f1a2465098c-kube-api-access-d4zwd\") pod \"8faad596-00ed-4982-9f42-2f1a2465098c\" (UID: \"8faad596-00ed-4982-9f42-2f1a2465098c\") " Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.671836 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c486dab-86dd-44dd-8c82-4c07ed84aa50-catalog-content\") pod \"9c486dab-86dd-44dd-8c82-4c07ed84aa50\" (UID: \"9c486dab-86dd-44dd-8c82-4c07ed84aa50\") " Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.671889 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/00730545-e9b7-4166-9f09-7a6fcac8cad3-marketplace-trusted-ca\") pod \"00730545-e9b7-4166-9f09-7a6fcac8cad3\" (UID: \"00730545-e9b7-4166-9f09-7a6fcac8cad3\") " Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.671915 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8faad596-00ed-4982-9f42-2f1a2465098c-catalog-content\") pod \"8faad596-00ed-4982-9f42-2f1a2465098c\" (UID: \"8faad596-00ed-4982-9f42-2f1a2465098c\") " Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.678011 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00730545-e9b7-4166-9f09-7a6fcac8cad3-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "00730545-e9b7-4166-9f09-7a6fcac8cad3" (UID: "00730545-e9b7-4166-9f09-7a6fcac8cad3"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.678053 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8faad596-00ed-4982-9f42-2f1a2465098c-utilities" (OuterVolumeSpecName: "utilities") pod "8faad596-00ed-4982-9f42-2f1a2465098c" (UID: "8faad596-00ed-4982-9f42-2f1a2465098c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.678308 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7e5d15e-f3f5-4595-be01-ae4f196285ad-utilities" (OuterVolumeSpecName: "utilities") pod "b7e5d15e-f3f5-4595-be01-ae4f196285ad" (UID: "b7e5d15e-f3f5-4595-be01-ae4f196285ad"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.682719 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00730545-e9b7-4166-9f09-7a6fcac8cad3-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "00730545-e9b7-4166-9f09-7a6fcac8cad3" (UID: "00730545-e9b7-4166-9f09-7a6fcac8cad3"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.682923 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00730545-e9b7-4166-9f09-7a6fcac8cad3-kube-api-access-fnxph" (OuterVolumeSpecName: "kube-api-access-fnxph") pod "00730545-e9b7-4166-9f09-7a6fcac8cad3" (UID: "00730545-e9b7-4166-9f09-7a6fcac8cad3"). InnerVolumeSpecName "kube-api-access-fnxph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.683935 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c486dab-86dd-44dd-8c82-4c07ed84aa50-utilities" (OuterVolumeSpecName: "utilities") pod "9c486dab-86dd-44dd-8c82-4c07ed84aa50" (UID: "9c486dab-86dd-44dd-8c82-4c07ed84aa50"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.685024 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7e5d15e-f3f5-4595-be01-ae4f196285ad-kube-api-access-fgp9z" (OuterVolumeSpecName: "kube-api-access-fgp9z") pod "b7e5d15e-f3f5-4595-be01-ae4f196285ad" (UID: "b7e5d15e-f3f5-4595-be01-ae4f196285ad"). InnerVolumeSpecName "kube-api-access-fgp9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.687104 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c486dab-86dd-44dd-8c82-4c07ed84aa50-kube-api-access-c9r2c" (OuterVolumeSpecName: "kube-api-access-c9r2c") pod "9c486dab-86dd-44dd-8c82-4c07ed84aa50" (UID: "9c486dab-86dd-44dd-8c82-4c07ed84aa50"). InnerVolumeSpecName "kube-api-access-c9r2c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.691463 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8faad596-00ed-4982-9f42-2f1a2465098c-kube-api-access-d4zwd" (OuterVolumeSpecName: "kube-api-access-d4zwd") pod "8faad596-00ed-4982-9f42-2f1a2465098c" (UID: "8faad596-00ed-4982-9f42-2f1a2465098c"). InnerVolumeSpecName "kube-api-access-d4zwd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.700833 4749 generic.go:334] "Generic (PLEG): container finished" podID="a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8" containerID="96c3778c4c41d3e6222d522de25ac4d41a6d02827a8c73b25ec24e7347258c10" exitCode=0 Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.701089 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9ss5d" event={"ID":"a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8","Type":"ContainerDied","Data":"96c3778c4c41d3e6222d522de25ac4d41a6d02827a8c73b25ec24e7347258c10"} Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.701124 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9ss5d" event={"ID":"a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8","Type":"ContainerDied","Data":"e497ff3133bd24ea6fe59a5158b6eff67216c2c81bf98ac7d6188c109f4be5c2"} Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.701143 4749 scope.go:117] "RemoveContainer" containerID="96c3778c4c41d3e6222d522de25ac4d41a6d02827a8c73b25ec24e7347258c10" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.701274 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9ss5d" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.707240 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c486dab-86dd-44dd-8c82-4c07ed84aa50-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9c486dab-86dd-44dd-8c82-4c07ed84aa50" (UID: "9c486dab-86dd-44dd-8c82-4c07ed84aa50"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.707331 4749 generic.go:334] "Generic (PLEG): container finished" podID="b7e5d15e-f3f5-4595-be01-ae4f196285ad" containerID="5b2744784a89060251e31f3410f211c4fdd6a528ed1e0fd2cb6969196eadcefb" exitCode=0 Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.707352 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rv5h9" event={"ID":"b7e5d15e-f3f5-4595-be01-ae4f196285ad","Type":"ContainerDied","Data":"5b2744784a89060251e31f3410f211c4fdd6a528ed1e0fd2cb6969196eadcefb"} Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.707575 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rv5h9" event={"ID":"b7e5d15e-f3f5-4595-be01-ae4f196285ad","Type":"ContainerDied","Data":"0cdc74c769040e7772b558188d153b2ac3a0fb715d4a1f99d22c13ca1b5d0be3"} Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.707424 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rv5h9" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.713668 4749 generic.go:334] "Generic (PLEG): container finished" podID="9c486dab-86dd-44dd-8c82-4c07ed84aa50" containerID="39fe9be0f912e80fe35c038568c54aa7f8ca9de79b1998dbdfc3281167af819e" exitCode=0 Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.713734 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l72fj" event={"ID":"9c486dab-86dd-44dd-8c82-4c07ed84aa50","Type":"ContainerDied","Data":"39fe9be0f912e80fe35c038568c54aa7f8ca9de79b1998dbdfc3281167af819e"} Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.713763 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l72fj" event={"ID":"9c486dab-86dd-44dd-8c82-4c07ed84aa50","Type":"ContainerDied","Data":"8a082d7c577197e0e3943ef9dd3a66d7f42bd030c0e2c94bb748625bcd7a5460"} Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.713839 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l72fj" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.719629 4749 scope.go:117] "RemoveContainer" containerID="b76ccba3f0a5a2989140618b2be44e4edb8fc5245e111df682092ac05537917c" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.721399 4749 generic.go:334] "Generic (PLEG): container finished" podID="8faad596-00ed-4982-9f42-2f1a2465098c" containerID="7f3a1323d1b5bd0dc74dafb6010e8593316a78e287be8f743cad2b14377d198d" exitCode=0 Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.721454 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m7xc9" event={"ID":"8faad596-00ed-4982-9f42-2f1a2465098c","Type":"ContainerDied","Data":"7f3a1323d1b5bd0dc74dafb6010e8593316a78e287be8f743cad2b14377d198d"} Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.721475 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m7xc9" event={"ID":"8faad596-00ed-4982-9f42-2f1a2465098c","Type":"ContainerDied","Data":"4e75a4e346c83f8b735eff87be26225ab4e08f99ad40440332375250e08fc9a6"} Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.721543 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m7xc9" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.725261 4749 generic.go:334] "Generic (PLEG): container finished" podID="00730545-e9b7-4166-9f09-7a6fcac8cad3" containerID="8c2a06811a45e54524da4727153d331ebe96f784206d3bcc1fde7d8744eb21ff" exitCode=0 Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.725298 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-j79zb" event={"ID":"00730545-e9b7-4166-9f09-7a6fcac8cad3","Type":"ContainerDied","Data":"8c2a06811a45e54524da4727153d331ebe96f784206d3bcc1fde7d8744eb21ff"} Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.725313 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-j79zb" event={"ID":"00730545-e9b7-4166-9f09-7a6fcac8cad3","Type":"ContainerDied","Data":"96c0d11937cd1764d96d49906a193ace9af75d78ad745e96703b8178636c1e54"} Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.725349 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-j79zb" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.734329 4749 scope.go:117] "RemoveContainer" containerID="14d34f2b700101fc81e29ffcc36f01ece29b230336a477e58c063cc0be27c1ba" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.746413 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l72fj"] Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.772787 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8-utilities\") pod \"a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8\" (UID: \"a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8\") " Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.772937 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d65vz\" (UniqueName: \"kubernetes.io/projected/a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8-kube-api-access-d65vz\") pod \"a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8\" (UID: \"a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8\") " Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.772997 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8-catalog-content\") pod \"a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8\" (UID: \"a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8\") " Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.773269 4749 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/00730545-e9b7-4166-9f09-7a6fcac8cad3-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.773307 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgp9z\" (UniqueName: \"kubernetes.io/projected/b7e5d15e-f3f5-4595-be01-ae4f196285ad-kube-api-access-fgp9z\") on node \"crc\" DevicePath \"\"" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.773322 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7e5d15e-f3f5-4595-be01-ae4f196285ad-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.773334 4749 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/00730545-e9b7-4166-9f09-7a6fcac8cad3-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.773347 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8faad596-00ed-4982-9f42-2f1a2465098c-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.773693 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c486dab-86dd-44dd-8c82-4c07ed84aa50-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.773728 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9r2c\" (UniqueName: \"kubernetes.io/projected/9c486dab-86dd-44dd-8c82-4c07ed84aa50-kube-api-access-c9r2c\") on node \"crc\" DevicePath \"\"" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.773740 4749 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-fnxph\" (UniqueName: \"kubernetes.io/projected/00730545-e9b7-4166-9f09-7a6fcac8cad3-kube-api-access-fnxph\") on node \"crc\" DevicePath \"\"" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.773750 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4zwd\" (UniqueName: \"kubernetes.io/projected/8faad596-00ed-4982-9f42-2f1a2465098c-kube-api-access-d4zwd\") on node \"crc\" DevicePath \"\"" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.773762 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c486dab-86dd-44dd-8c82-4c07ed84aa50-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.774153 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8-utilities" (OuterVolumeSpecName: "utilities") pod "a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8" (UID: "a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.775297 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-l72fj"] Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.776115 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8-kube-api-access-d65vz" (OuterVolumeSpecName: "kube-api-access-d65vz") pod "a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8" (UID: "a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8"). InnerVolumeSpecName "kube-api-access-d65vz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.776568 4749 scope.go:117] "RemoveContainer" containerID="96c3778c4c41d3e6222d522de25ac4d41a6d02827a8c73b25ec24e7347258c10" Mar 20 07:19:58 crc kubenswrapper[4749]: E0320 07:19:58.777949 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96c3778c4c41d3e6222d522de25ac4d41a6d02827a8c73b25ec24e7347258c10\": container with ID starting with 96c3778c4c41d3e6222d522de25ac4d41a6d02827a8c73b25ec24e7347258c10 not found: ID does not exist" containerID="96c3778c4c41d3e6222d522de25ac4d41a6d02827a8c73b25ec24e7347258c10" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.777985 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96c3778c4c41d3e6222d522de25ac4d41a6d02827a8c73b25ec24e7347258c10"} err="failed to get container status \"96c3778c4c41d3e6222d522de25ac4d41a6d02827a8c73b25ec24e7347258c10\": rpc error: code = NotFound desc = could not find container \"96c3778c4c41d3e6222d522de25ac4d41a6d02827a8c73b25ec24e7347258c10\": container with ID starting with 96c3778c4c41d3e6222d522de25ac4d41a6d02827a8c73b25ec24e7347258c10 not found: ID does not exist" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.778024 4749 scope.go:117] "RemoveContainer" containerID="b76ccba3f0a5a2989140618b2be44e4edb8fc5245e111df682092ac05537917c" Mar 20 07:19:58 crc kubenswrapper[4749]: E0320 07:19:58.778375 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b76ccba3f0a5a2989140618b2be44e4edb8fc5245e111df682092ac05537917c\": container with ID starting with b76ccba3f0a5a2989140618b2be44e4edb8fc5245e111df682092ac05537917c not 
found: ID does not exist" containerID="b76ccba3f0a5a2989140618b2be44e4edb8fc5245e111df682092ac05537917c" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.778405 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b76ccba3f0a5a2989140618b2be44e4edb8fc5245e111df682092ac05537917c"} err="failed to get container status \"b76ccba3f0a5a2989140618b2be44e4edb8fc5245e111df682092ac05537917c\": rpc error: code = NotFound desc = could not find container \"b76ccba3f0a5a2989140618b2be44e4edb8fc5245e111df682092ac05537917c\": container with ID starting with b76ccba3f0a5a2989140618b2be44e4edb8fc5245e111df682092ac05537917c not found: ID does not exist" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.778422 4749 scope.go:117] "RemoveContainer" containerID="14d34f2b700101fc81e29ffcc36f01ece29b230336a477e58c063cc0be27c1ba" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.779141 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-j79zb"] Mar 20 07:19:58 crc kubenswrapper[4749]: E0320 07:19:58.780023 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14d34f2b700101fc81e29ffcc36f01ece29b230336a477e58c063cc0be27c1ba\": container with ID starting with 14d34f2b700101fc81e29ffcc36f01ece29b230336a477e58c063cc0be27c1ba not found: ID does not exist" containerID="14d34f2b700101fc81e29ffcc36f01ece29b230336a477e58c063cc0be27c1ba" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.780040 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14d34f2b700101fc81e29ffcc36f01ece29b230336a477e58c063cc0be27c1ba"} err="failed to get container status \"14d34f2b700101fc81e29ffcc36f01ece29b230336a477e58c063cc0be27c1ba\": rpc error: code = NotFound desc = could not find container \"14d34f2b700101fc81e29ffcc36f01ece29b230336a477e58c063cc0be27c1ba\": container with ID starting with 14d34f2b700101fc81e29ffcc36f01ece29b230336a477e58c063cc0be27c1ba not found: ID does not exist" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.780089 4749 scope.go:117] "RemoveContainer" containerID="5b2744784a89060251e31f3410f211c4fdd6a528ed1e0fd2cb6969196eadcefb" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.783078 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-j79zb"] Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.793276 4749 scope.go:117] "RemoveContainer" containerID="cec62a01f58dc1bb798827b623adb1ec9658643bbf726980bae60126c0e39af0" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.810802 4749 scope.go:117] "RemoveContainer" containerID="f874c7a512f959ae6157a0fad44f63e0445599ccb8af9ce663a82dd65f470823" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.815820 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7e5d15e-f3f5-4595-be01-ae4f196285ad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b7e5d15e-f3f5-4595-be01-ae4f196285ad" (UID: "b7e5d15e-f3f5-4595-be01-ae4f196285ad"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:19:58 crc kubenswrapper[4749]: E0320 07:19:58.816374 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00730545_e9b7_4166_9f09_7a6fcac8cad3.slice/crio-96c0d11937cd1764d96d49906a193ace9af75d78ad745e96703b8178636c1e54\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c486dab_86dd_44dd_8c82_4c07ed84aa50.slice/crio-8a082d7c577197e0e3943ef9dd3a66d7f42bd030c0e2c94bb748625bcd7a5460\": RecentStats: unable to find data in memory cache]" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.827091 4749 scope.go:117] "RemoveContainer" containerID="5b2744784a89060251e31f3410f211c4fdd6a528ed1e0fd2cb6969196eadcefb" Mar 20 07:19:58 crc kubenswrapper[4749]: E0320 07:19:58.827461 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b2744784a89060251e31f3410f211c4fdd6a528ed1e0fd2cb6969196eadcefb\": container with ID starting with 5b2744784a89060251e31f3410f211c4fdd6a528ed1e0fd2cb6969196eadcefb not found: ID does not exist" containerID="5b2744784a89060251e31f3410f211c4fdd6a528ed1e0fd2cb6969196eadcefb" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.827487 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b2744784a89060251e31f3410f211c4fdd6a528ed1e0fd2cb6969196eadcefb"} err="failed to get container status \"5b2744784a89060251e31f3410f211c4fdd6a528ed1e0fd2cb6969196eadcefb\": rpc error: code = NotFound desc = could not find container \"5b2744784a89060251e31f3410f211c4fdd6a528ed1e0fd2cb6969196eadcefb\": container with ID starting with 5b2744784a89060251e31f3410f211c4fdd6a528ed1e0fd2cb6969196eadcefb not found: ID does not exist" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.827507 4749 scope.go:117] "RemoveContainer" containerID="cec62a01f58dc1bb798827b623adb1ec9658643bbf726980bae60126c0e39af0" Mar 20 07:19:58 crc kubenswrapper[4749]: E0320 07:19:58.827886 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cec62a01f58dc1bb798827b623adb1ec9658643bbf726980bae60126c0e39af0\": container with ID starting with cec62a01f58dc1bb798827b623adb1ec9658643bbf726980bae60126c0e39af0 not found: ID does not exist" containerID="cec62a01f58dc1bb798827b623adb1ec9658643bbf726980bae60126c0e39af0" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.827958 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cec62a01f58dc1bb798827b623adb1ec9658643bbf726980bae60126c0e39af0"} err="failed to get container status \"cec62a01f58dc1bb798827b623adb1ec9658643bbf726980bae60126c0e39af0\": rpc error: code = NotFound desc = could not find container \"cec62a01f58dc1bb798827b623adb1ec9658643bbf726980bae60126c0e39af0\": container with ID starting with cec62a01f58dc1bb798827b623adb1ec9658643bbf726980bae60126c0e39af0 not found: ID does not exist" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.827989 4749 scope.go:117] "RemoveContainer" containerID="f874c7a512f959ae6157a0fad44f63e0445599ccb8af9ce663a82dd65f470823" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.828272 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8" (UID: "a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:19:58 crc kubenswrapper[4749]: E0320 07:19:58.828437 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f874c7a512f959ae6157a0fad44f63e0445599ccb8af9ce663a82dd65f470823\": container with ID starting with f874c7a512f959ae6157a0fad44f63e0445599ccb8af9ce663a82dd65f470823 not found: ID does not exist" containerID="f874c7a512f959ae6157a0fad44f63e0445599ccb8af9ce663a82dd65f470823" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.828470 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f874c7a512f959ae6157a0fad44f63e0445599ccb8af9ce663a82dd65f470823"} err="failed to get container status \"f874c7a512f959ae6157a0fad44f63e0445599ccb8af9ce663a82dd65f470823\": rpc error: code = NotFound desc = could not find container \"f874c7a512f959ae6157a0fad44f63e0445599ccb8af9ce663a82dd65f470823\": container with ID starting with f874c7a512f959ae6157a0fad44f63e0445599ccb8af9ce663a82dd65f470823 not found: ID does not exist" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.828489 4749 scope.go:117] "RemoveContainer" containerID="39fe9be0f912e80fe35c038568c54aa7f8ca9de79b1998dbdfc3281167af819e" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.841700 4749 scope.go:117] "RemoveContainer" containerID="605181e8839a7bab36bf33578fe78f629ef89f82b77f709f1ed2a8398683a2d0" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.856368 4749 scope.go:117] "RemoveContainer" containerID="e277e7269daa7a860daf228a1a39f92b4b140989bc9efc5568d849d5d3baa18b" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.873493 4749 scope.go:117] "RemoveContainer" containerID="39fe9be0f912e80fe35c038568c54aa7f8ca9de79b1998dbdfc3281167af819e" Mar 20 07:19:58 crc kubenswrapper[4749]: E0320 07:19:58.873918 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39fe9be0f912e80fe35c038568c54aa7f8ca9de79b1998dbdfc3281167af819e\": container with ID starting with 39fe9be0f912e80fe35c038568c54aa7f8ca9de79b1998dbdfc3281167af819e not found: ID does not exist" containerID="39fe9be0f912e80fe35c038568c54aa7f8ca9de79b1998dbdfc3281167af819e" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.873992 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39fe9be0f912e80fe35c038568c54aa7f8ca9de79b1998dbdfc3281167af819e"} err="failed to get container status \"39fe9be0f912e80fe35c038568c54aa7f8ca9de79b1998dbdfc3281167af819e\": rpc error: code = NotFound desc = could not find container \"39fe9be0f912e80fe35c038568c54aa7f8ca9de79b1998dbdfc3281167af819e\": container with ID starting with 39fe9be0f912e80fe35c038568c54aa7f8ca9de79b1998dbdfc3281167af819e not found: ID does not exist" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.874025 4749 scope.go:117] "RemoveContainer" containerID="605181e8839a7bab36bf33578fe78f629ef89f82b77f709f1ed2a8398683a2d0" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.874485 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d65vz\" (UniqueName: 
\"kubernetes.io/projected/a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8-kube-api-access-d65vz\") on node \"crc\" DevicePath \"\"" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.874520 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7e5d15e-f3f5-4595-be01-ae4f196285ad-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.874529 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.874539 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 07:19:58 crc kubenswrapper[4749]: E0320 07:19:58.874639 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"605181e8839a7bab36bf33578fe78f629ef89f82b77f709f1ed2a8398683a2d0\": container with ID starting with 605181e8839a7bab36bf33578fe78f629ef89f82b77f709f1ed2a8398683a2d0 not found: ID does not exist" containerID="605181e8839a7bab36bf33578fe78f629ef89f82b77f709f1ed2a8398683a2d0" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.874668 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"605181e8839a7bab36bf33578fe78f629ef89f82b77f709f1ed2a8398683a2d0"} err="failed to get container status \"605181e8839a7bab36bf33578fe78f629ef89f82b77f709f1ed2a8398683a2d0\": rpc error: code = NotFound desc = could not find container \"605181e8839a7bab36bf33578fe78f629ef89f82b77f709f1ed2a8398683a2d0\": container with ID starting with 605181e8839a7bab36bf33578fe78f629ef89f82b77f709f1ed2a8398683a2d0 not found: ID does not exist" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.874695 4749 scope.go:117] "RemoveContainer" containerID="e277e7269daa7a860daf228a1a39f92b4b140989bc9efc5568d849d5d3baa18b" Mar 20 07:19:58 crc kubenswrapper[4749]: E0320 07:19:58.874958 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e277e7269daa7a860daf228a1a39f92b4b140989bc9efc5568d849d5d3baa18b\": container with ID starting with e277e7269daa7a860daf228a1a39f92b4b140989bc9efc5568d849d5d3baa18b not found: ID does not exist" containerID="e277e7269daa7a860daf228a1a39f92b4b140989bc9efc5568d849d5d3baa18b" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.874996 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e277e7269daa7a860daf228a1a39f92b4b140989bc9efc5568d849d5d3baa18b"} err="failed to get container status \"e277e7269daa7a860daf228a1a39f92b4b140989bc9efc5568d849d5d3baa18b\": rpc error: code = NotFound desc = could not find container \"e277e7269daa7a860daf228a1a39f92b4b140989bc9efc5568d849d5d3baa18b\": container with ID starting with e277e7269daa7a860daf228a1a39f92b4b140989bc9efc5568d849d5d3baa18b not found: ID does not exist" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.875026 4749 scope.go:117] "RemoveContainer" containerID="7f3a1323d1b5bd0dc74dafb6010e8593316a78e287be8f743cad2b14377d198d" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.888644 4749 scope.go:117] "RemoveContainer" 
containerID="0d5ae3d79e06d7453fd6ee561c7382bb3d872ad722d8e4a54638397bec96701c" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.894698 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8faad596-00ed-4982-9f42-2f1a2465098c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8faad596-00ed-4982-9f42-2f1a2465098c" (UID: "8faad596-00ed-4982-9f42-2f1a2465098c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.903253 4749 scope.go:117] "RemoveContainer" containerID="a187deff2bddc293f5f4627656514e6ed7fb308965e742a6519be5199f2210ee" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.915918 4749 scope.go:117] "RemoveContainer" containerID="7f3a1323d1b5bd0dc74dafb6010e8593316a78e287be8f743cad2b14377d198d" Mar 20 07:19:58 crc kubenswrapper[4749]: E0320 07:19:58.916174 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f3a1323d1b5bd0dc74dafb6010e8593316a78e287be8f743cad2b14377d198d\": container with ID starting with 7f3a1323d1b5bd0dc74dafb6010e8593316a78e287be8f743cad2b14377d198d not found: ID does not exist" containerID="7f3a1323d1b5bd0dc74dafb6010e8593316a78e287be8f743cad2b14377d198d" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.916219 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f3a1323d1b5bd0dc74dafb6010e8593316a78e287be8f743cad2b14377d198d"} err="failed to get container status \"7f3a1323d1b5bd0dc74dafb6010e8593316a78e287be8f743cad2b14377d198d\": rpc error: code = NotFound desc = could not find container \"7f3a1323d1b5bd0dc74dafb6010e8593316a78e287be8f743cad2b14377d198d\": container with ID starting with 7f3a1323d1b5bd0dc74dafb6010e8593316a78e287be8f743cad2b14377d198d not found: ID does not exist" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.916250 4749 scope.go:117] "RemoveContainer" containerID="0d5ae3d79e06d7453fd6ee561c7382bb3d872ad722d8e4a54638397bec96701c" Mar 20 07:19:58 crc kubenswrapper[4749]: E0320 07:19:58.916773 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d5ae3d79e06d7453fd6ee561c7382bb3d872ad722d8e4a54638397bec96701c\": container with ID starting with 0d5ae3d79e06d7453fd6ee561c7382bb3d872ad722d8e4a54638397bec96701c not found: ID does not exist" containerID="0d5ae3d79e06d7453fd6ee561c7382bb3d872ad722d8e4a54638397bec96701c" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.916812 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d5ae3d79e06d7453fd6ee561c7382bb3d872ad722d8e4a54638397bec96701c"} err="failed to get container status \"0d5ae3d79e06d7453fd6ee561c7382bb3d872ad722d8e4a54638397bec96701c\": rpc error: code = NotFound desc = could not find container \"0d5ae3d79e06d7453fd6ee561c7382bb3d872ad722d8e4a54638397bec96701c\": container with ID starting with 0d5ae3d79e06d7453fd6ee561c7382bb3d872ad722d8e4a54638397bec96701c not found: ID does not exist" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.916842 4749 scope.go:117] "RemoveContainer" containerID="a187deff2bddc293f5f4627656514e6ed7fb308965e742a6519be5199f2210ee" Mar 20 07:19:58 crc kubenswrapper[4749]: E0320 07:19:58.917051 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a187deff2bddc293f5f4627656514e6ed7fb308965e742a6519be5199f2210ee\": container with ID starting with a187deff2bddc293f5f4627656514e6ed7fb308965e742a6519be5199f2210ee not found: ID does not exist" containerID="a187deff2bddc293f5f4627656514e6ed7fb308965e742a6519be5199f2210ee" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.917082 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a187deff2bddc293f5f4627656514e6ed7fb308965e742a6519be5199f2210ee"} err="failed to get container status \"a187deff2bddc293f5f4627656514e6ed7fb308965e742a6519be5199f2210ee\": rpc error: code = NotFound desc = could not find container \"a187deff2bddc293f5f4627656514e6ed7fb308965e742a6519be5199f2210ee\": container with ID starting with a187deff2bddc293f5f4627656514e6ed7fb308965e742a6519be5199f2210ee not found: ID does not exist" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.917099 4749 scope.go:117] "RemoveContainer" containerID="8c2a06811a45e54524da4727153d331ebe96f784206d3bcc1fde7d8744eb21ff" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.932339 4749 scope.go:117] "RemoveContainer" containerID="67cb77d5c367914c438c14e7a50a60f7b25514b9f084346fde48e6b9dcbfb6c7" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.949311 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9lpwm"] Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.954963 4749 scope.go:117] "RemoveContainer" containerID="8c2a06811a45e54524da4727153d331ebe96f784206d3bcc1fde7d8744eb21ff" Mar 20 07:19:58 crc kubenswrapper[4749]: E0320 07:19:58.955407 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c2a06811a45e54524da4727153d331ebe96f784206d3bcc1fde7d8744eb21ff\": container with ID starting with 8c2a06811a45e54524da4727153d331ebe96f784206d3bcc1fde7d8744eb21ff not found: ID does not exist" containerID="8c2a06811a45e54524da4727153d331ebe96f784206d3bcc1fde7d8744eb21ff" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.955442 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c2a06811a45e54524da4727153d331ebe96f784206d3bcc1fde7d8744eb21ff"} err="failed to get container status \"8c2a06811a45e54524da4727153d331ebe96f784206d3bcc1fde7d8744eb21ff\": rpc error: code = NotFound desc = could not find container \"8c2a06811a45e54524da4727153d331ebe96f784206d3bcc1fde7d8744eb21ff\": container with ID starting with 8c2a06811a45e54524da4727153d331ebe96f784206d3bcc1fde7d8744eb21ff not found: ID does not exist" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.955467 4749 scope.go:117] "RemoveContainer" containerID="67cb77d5c367914c438c14e7a50a60f7b25514b9f084346fde48e6b9dcbfb6c7" Mar 20 07:19:58 crc kubenswrapper[4749]: E0320 07:19:58.955996 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67cb77d5c367914c438c14e7a50a60f7b25514b9f084346fde48e6b9dcbfb6c7\": container with ID starting with 67cb77d5c367914c438c14e7a50a60f7b25514b9f084346fde48e6b9dcbfb6c7 not found: ID does not exist" containerID="67cb77d5c367914c438c14e7a50a60f7b25514b9f084346fde48e6b9dcbfb6c7" Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.956046 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67cb77d5c367914c438c14e7a50a60f7b25514b9f084346fde48e6b9dcbfb6c7"} err="failed to get container 
status \"67cb77d5c367914c438c14e7a50a60f7b25514b9f084346fde48e6b9dcbfb6c7\": rpc error: code = NotFound desc = could not find container \"67cb77d5c367914c438c14e7a50a60f7b25514b9f084346fde48e6b9dcbfb6c7\": container with ID starting with 67cb77d5c367914c438c14e7a50a60f7b25514b9f084346fde48e6b9dcbfb6c7 not found: ID does not exist" Mar 20 07:19:58 crc kubenswrapper[4749]: W0320 07:19:58.962827 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab07be1c_a7c8_4310_b2be_7dea01a4a55b.slice/crio-f3d567929d695f5348a34545401b1129232d58a004a1dad4274c88157a5868f8 WatchSource:0}: Error finding container f3d567929d695f5348a34545401b1129232d58a004a1dad4274c88157a5868f8: Status 404 returned error can't find the container with id f3d567929d695f5348a34545401b1129232d58a004a1dad4274c88157a5868f8 Mar 20 07:19:58 crc kubenswrapper[4749]: I0320 07:19:58.975359 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8faad596-00ed-4982-9f42-2f1a2465098c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 07:19:59 crc kubenswrapper[4749]: I0320 07:19:59.059083 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9ss5d"] Mar 20 07:19:59 crc kubenswrapper[4749]: I0320 07:19:59.079719 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9ss5d"] Mar 20 07:19:59 crc kubenswrapper[4749]: I0320 07:19:59.086403 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rv5h9"] Mar 20 07:19:59 crc kubenswrapper[4749]: I0320 07:19:59.094783 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rv5h9"] Mar 20 07:19:59 crc kubenswrapper[4749]: I0320 07:19:59.097515 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m7xc9"] Mar 20 07:19:59 crc kubenswrapper[4749]: I0320 07:19:59.101586 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-m7xc9"] Mar 20 07:19:59 crc kubenswrapper[4749]: I0320 07:19:59.747061 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9lpwm" event={"ID":"ab07be1c-a7c8-4310-b2be-7dea01a4a55b","Type":"ContainerStarted","Data":"944af8bd5b024d289c61095556db6e5aeb17323e78bfb215b36b7c0741770f64"} Mar 20 07:19:59 crc kubenswrapper[4749]: I0320 07:19:59.747109 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9lpwm" event={"ID":"ab07be1c-a7c8-4310-b2be-7dea01a4a55b","Type":"ContainerStarted","Data":"f3d567929d695f5348a34545401b1129232d58a004a1dad4274c88157a5868f8"} Mar 20 07:19:59 crc kubenswrapper[4749]: I0320 07:19:59.748204 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-9lpwm" Mar 20 07:19:59 crc kubenswrapper[4749]: I0320 07:19:59.751896 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-9lpwm" Mar 20 07:19:59 crc kubenswrapper[4749]: I0320 07:19:59.764337 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-9lpwm" podStartSLOduration=1.764321356 podStartE2EDuration="1.764321356s" podCreationTimestamp="2026-03-20 07:19:58 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:19:59.760983766 +0000 UTC m=+436.310641413" watchObservedRunningTime="2026-03-20 07:19:59.764321356 +0000 UTC m=+436.313979013" Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.122811 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566520-j7gn9"] Mar 20 07:20:00 crc kubenswrapper[4749]: E0320 07:20:00.123014 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00730545-e9b7-4166-9f09-7a6fcac8cad3" containerName="marketplace-operator" Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.123025 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="00730545-e9b7-4166-9f09-7a6fcac8cad3" containerName="marketplace-operator" Mar 20 07:20:00 crc kubenswrapper[4749]: E0320 07:20:00.123034 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7e5d15e-f3f5-4595-be01-ae4f196285ad" containerName="extract-utilities" Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.123039 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7e5d15e-f3f5-4595-be01-ae4f196285ad" containerName="extract-utilities" Mar 20 07:20:00 crc kubenswrapper[4749]: E0320 07:20:00.123051 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c486dab-86dd-44dd-8c82-4c07ed84aa50" containerName="extract-utilities" Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.123057 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c486dab-86dd-44dd-8c82-4c07ed84aa50" containerName="extract-utilities" Mar 20 07:20:00 crc kubenswrapper[4749]: E0320 07:20:00.123062 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8" containerName="registry-server" Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.123068 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8" containerName="registry-server" Mar 20 07:20:00 crc kubenswrapper[4749]: E0320 07:20:00.123075 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8" containerName="extract-content" Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.123081 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8" containerName="extract-content" Mar 20 07:20:00 crc kubenswrapper[4749]: E0320 07:20:00.123090 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c486dab-86dd-44dd-8c82-4c07ed84aa50" containerName="extract-content" Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.123097 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c486dab-86dd-44dd-8c82-4c07ed84aa50" containerName="extract-content" Mar 20 07:20:00 crc kubenswrapper[4749]: E0320 07:20:00.123104 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7e5d15e-f3f5-4595-be01-ae4f196285ad" containerName="extract-content" Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.123110 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7e5d15e-f3f5-4595-be01-ae4f196285ad" containerName="extract-content" Mar 20 07:20:00 crc kubenswrapper[4749]: E0320 07:20:00.123119 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c486dab-86dd-44dd-8c82-4c07ed84aa50" containerName="registry-server" Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.123125 4749 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="9c486dab-86dd-44dd-8c82-4c07ed84aa50" containerName="registry-server" Mar 20 07:20:00 crc kubenswrapper[4749]: E0320 07:20:00.123133 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7e5d15e-f3f5-4595-be01-ae4f196285ad" containerName="registry-server" Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.123138 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7e5d15e-f3f5-4595-be01-ae4f196285ad" containerName="registry-server" Mar 20 07:20:00 crc kubenswrapper[4749]: E0320 07:20:00.123144 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8faad596-00ed-4982-9f42-2f1a2465098c" containerName="extract-content" Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.123149 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8faad596-00ed-4982-9f42-2f1a2465098c" containerName="extract-content" Mar 20 07:20:00 crc kubenswrapper[4749]: E0320 07:20:00.123157 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8faad596-00ed-4982-9f42-2f1a2465098c" containerName="registry-server" Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.123162 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8faad596-00ed-4982-9f42-2f1a2465098c" containerName="registry-server" Mar 20 07:20:00 crc kubenswrapper[4749]: E0320 07:20:00.123173 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8faad596-00ed-4982-9f42-2f1a2465098c" containerName="extract-utilities" Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.123179 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8faad596-00ed-4982-9f42-2f1a2465098c" containerName="extract-utilities" Mar 20 07:20:00 crc kubenswrapper[4749]: E0320 07:20:00.123192 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8" containerName="extract-utilities" Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.123197 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8" containerName="extract-utilities" Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.123301 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="00730545-e9b7-4166-9f09-7a6fcac8cad3" containerName="marketplace-operator" Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.123314 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="8faad596-00ed-4982-9f42-2f1a2465098c" containerName="registry-server" Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.123321 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8" containerName="registry-server" Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.123330 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c486dab-86dd-44dd-8c82-4c07ed84aa50" containerName="registry-server" Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.123339 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7e5d15e-f3f5-4595-be01-ae4f196285ad" containerName="registry-server" Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.123663 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566520-j7gn9" Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.125341 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6rhdf" Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.125429 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.125571 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.131249 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566520-j7gn9"] Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.185190 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00730545-e9b7-4166-9f09-7a6fcac8cad3" path="/var/lib/kubelet/pods/00730545-e9b7-4166-9f09-7a6fcac8cad3/volumes" Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.186090 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8faad596-00ed-4982-9f42-2f1a2465098c" path="/var/lib/kubelet/pods/8faad596-00ed-4982-9f42-2f1a2465098c/volumes" Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.187015 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c486dab-86dd-44dd-8c82-4c07ed84aa50" path="/var/lib/kubelet/pods/9c486dab-86dd-44dd-8c82-4c07ed84aa50/volumes" Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.188342 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8" path="/var/lib/kubelet/pods/a5ea8b8a-6944-4aeb-ae5e-62e6b14dc9f8/volumes" Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.190247 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7e5d15e-f3f5-4595-be01-ae4f196285ad" path="/var/lib/kubelet/pods/b7e5d15e-f3f5-4595-be01-ae4f196285ad/volumes" Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.293767 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8gc5\" (UniqueName: \"kubernetes.io/projected/2e201ab3-7a56-4786-867f-5beef3df85b8-kube-api-access-d8gc5\") pod \"auto-csr-approver-29566520-j7gn9\" (UID: \"2e201ab3-7a56-4786-867f-5beef3df85b8\") " pod="openshift-infra/auto-csr-approver-29566520-j7gn9" Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.395388 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8gc5\" (UniqueName: \"kubernetes.io/projected/2e201ab3-7a56-4786-867f-5beef3df85b8-kube-api-access-d8gc5\") pod \"auto-csr-approver-29566520-j7gn9\" (UID: \"2e201ab3-7a56-4786-867f-5beef3df85b8\") " pod="openshift-infra/auto-csr-approver-29566520-j7gn9" Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.413705 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8gc5\" (UniqueName: \"kubernetes.io/projected/2e201ab3-7a56-4786-867f-5beef3df85b8-kube-api-access-d8gc5\") pod \"auto-csr-approver-29566520-j7gn9\" (UID: \"2e201ab3-7a56-4786-867f-5beef3df85b8\") " pod="openshift-infra/auto-csr-approver-29566520-j7gn9" Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.442430 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566520-j7gn9" Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.620514 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dn6gk"] Mar 20 07:20:00 crc kubenswrapper[4749]: E0320 07:20:00.620738 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00730545-e9b7-4166-9f09-7a6fcac8cad3" containerName="marketplace-operator" Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.620748 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="00730545-e9b7-4166-9f09-7a6fcac8cad3" containerName="marketplace-operator" Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.620842 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="00730545-e9b7-4166-9f09-7a6fcac8cad3" containerName="marketplace-operator" Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.621543 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dn6gk" Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.636596 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dn6gk"] Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.638617 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.732884 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566520-j7gn9"] Mar 20 07:20:00 crc kubenswrapper[4749]: W0320 07:20:00.739675 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e201ab3_7a56_4786_867f_5beef3df85b8.slice/crio-8fdd15c6c0c54d0e7256c668273b539d4981daf07ab9aee62c4f161254b7ae5a WatchSource:0}: Error finding container 8fdd15c6c0c54d0e7256c668273b539d4981daf07ab9aee62c4f161254b7ae5a: Status 404 returned error can't find the container with id 8fdd15c6c0c54d0e7256c668273b539d4981daf07ab9aee62c4f161254b7ae5a Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.758241 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566520-j7gn9" event={"ID":"2e201ab3-7a56-4786-867f-5beef3df85b8","Type":"ContainerStarted","Data":"8fdd15c6c0c54d0e7256c668273b539d4981daf07ab9aee62c4f161254b7ae5a"} Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.800892 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e86bd4a9-6a7b-432d-824c-e03199a458f6-catalog-content\") pod \"certified-operators-dn6gk\" (UID: \"e86bd4a9-6a7b-432d-824c-e03199a458f6\") " pod="openshift-marketplace/certified-operators-dn6gk" Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.801060 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ljf8\" (UniqueName: \"kubernetes.io/projected/e86bd4a9-6a7b-432d-824c-e03199a458f6-kube-api-access-9ljf8\") pod \"certified-operators-dn6gk\" (UID: \"e86bd4a9-6a7b-432d-824c-e03199a458f6\") " pod="openshift-marketplace/certified-operators-dn6gk" Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.801270 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e86bd4a9-6a7b-432d-824c-e03199a458f6-utilities\") pod \"certified-operators-dn6gk\" (UID: \"e86bd4a9-6a7b-432d-824c-e03199a458f6\") " pod="openshift-marketplace/certified-operators-dn6gk" Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.811722 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-stfh9"] Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.812691 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-stfh9" Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.814512 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.818916 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-stfh9"] Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.902145 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e86bd4a9-6a7b-432d-824c-e03199a458f6-utilities\") pod \"certified-operators-dn6gk\" (UID: \"e86bd4a9-6a7b-432d-824c-e03199a458f6\") " pod="openshift-marketplace/certified-operators-dn6gk" Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.902218 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e86bd4a9-6a7b-432d-824c-e03199a458f6-catalog-content\") pod \"certified-operators-dn6gk\" (UID: \"e86bd4a9-6a7b-432d-824c-e03199a458f6\") " pod="openshift-marketplace/certified-operators-dn6gk" Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.902381 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a3930fe-a227-4dbd-82ec-d9e95f06a317-catalog-content\") pod \"community-operators-stfh9\" (UID: \"6a3930fe-a227-4dbd-82ec-d9e95f06a317\") " pod="openshift-marketplace/community-operators-stfh9" Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.902431 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ljf8\" (UniqueName: \"kubernetes.io/projected/e86bd4a9-6a7b-432d-824c-e03199a458f6-kube-api-access-9ljf8\") pod \"certified-operators-dn6gk\" (UID: \"e86bd4a9-6a7b-432d-824c-e03199a458f6\") " pod="openshift-marketplace/certified-operators-dn6gk" Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.902491 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a3930fe-a227-4dbd-82ec-d9e95f06a317-utilities\") pod \"community-operators-stfh9\" (UID: \"6a3930fe-a227-4dbd-82ec-d9e95f06a317\") " pod="openshift-marketplace/community-operators-stfh9" Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.902529 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2pvp\" (UniqueName: \"kubernetes.io/projected/6a3930fe-a227-4dbd-82ec-d9e95f06a317-kube-api-access-b2pvp\") pod \"community-operators-stfh9\" (UID: \"6a3930fe-a227-4dbd-82ec-d9e95f06a317\") " pod="openshift-marketplace/community-operators-stfh9" Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.904020 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/e86bd4a9-6a7b-432d-824c-e03199a458f6-catalog-content\") pod \"certified-operators-dn6gk\" (UID: \"e86bd4a9-6a7b-432d-824c-e03199a458f6\") " pod="openshift-marketplace/certified-operators-dn6gk" Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.904274 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e86bd4a9-6a7b-432d-824c-e03199a458f6-utilities\") pod \"certified-operators-dn6gk\" (UID: \"e86bd4a9-6a7b-432d-824c-e03199a458f6\") " pod="openshift-marketplace/certified-operators-dn6gk" Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.937228 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ljf8\" (UniqueName: \"kubernetes.io/projected/e86bd4a9-6a7b-432d-824c-e03199a458f6-kube-api-access-9ljf8\") pod \"certified-operators-dn6gk\" (UID: \"e86bd4a9-6a7b-432d-824c-e03199a458f6\") " pod="openshift-marketplace/certified-operators-dn6gk" Mar 20 07:20:00 crc kubenswrapper[4749]: I0320 07:20:00.974307 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dn6gk" Mar 20 07:20:01 crc kubenswrapper[4749]: I0320 07:20:01.003686 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a3930fe-a227-4dbd-82ec-d9e95f06a317-catalog-content\") pod \"community-operators-stfh9\" (UID: \"6a3930fe-a227-4dbd-82ec-d9e95f06a317\") " pod="openshift-marketplace/community-operators-stfh9" Mar 20 07:20:01 crc kubenswrapper[4749]: I0320 07:20:01.003744 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a3930fe-a227-4dbd-82ec-d9e95f06a317-utilities\") pod \"community-operators-stfh9\" (UID: \"6a3930fe-a227-4dbd-82ec-d9e95f06a317\") " pod="openshift-marketplace/community-operators-stfh9" Mar 20 07:20:01 crc kubenswrapper[4749]: I0320 07:20:01.003766 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2pvp\" (UniqueName: \"kubernetes.io/projected/6a3930fe-a227-4dbd-82ec-d9e95f06a317-kube-api-access-b2pvp\") pod \"community-operators-stfh9\" (UID: \"6a3930fe-a227-4dbd-82ec-d9e95f06a317\") " pod="openshift-marketplace/community-operators-stfh9" Mar 20 07:20:01 crc kubenswrapper[4749]: I0320 07:20:01.005492 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a3930fe-a227-4dbd-82ec-d9e95f06a317-utilities\") pod \"community-operators-stfh9\" (UID: \"6a3930fe-a227-4dbd-82ec-d9e95f06a317\") " pod="openshift-marketplace/community-operators-stfh9" Mar 20 07:20:01 crc kubenswrapper[4749]: I0320 07:20:01.005582 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a3930fe-a227-4dbd-82ec-d9e95f06a317-catalog-content\") pod \"community-operators-stfh9\" (UID: \"6a3930fe-a227-4dbd-82ec-d9e95f06a317\") " pod="openshift-marketplace/community-operators-stfh9" Mar 20 07:20:01 crc kubenswrapper[4749]: I0320 07:20:01.025779 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2pvp\" (UniqueName: \"kubernetes.io/projected/6a3930fe-a227-4dbd-82ec-d9e95f06a317-kube-api-access-b2pvp\") pod \"community-operators-stfh9\" (UID: \"6a3930fe-a227-4dbd-82ec-d9e95f06a317\") " pod="openshift-marketplace/community-operators-stfh9" Mar 
20 07:20:01 crc kubenswrapper[4749]: I0320 07:20:01.136625 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-stfh9" Mar 20 07:20:01 crc kubenswrapper[4749]: I0320 07:20:01.222125 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dn6gk"] Mar 20 07:20:01 crc kubenswrapper[4749]: W0320 07:20:01.236629 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode86bd4a9_6a7b_432d_824c_e03199a458f6.slice/crio-b84c313f117b3eab5569a744be2e96c0d625e6e26024af8995508f12eb4b80a7 WatchSource:0}: Error finding container b84c313f117b3eab5569a744be2e96c0d625e6e26024af8995508f12eb4b80a7: Status 404 returned error can't find the container with id b84c313f117b3eab5569a744be2e96c0d625e6e26024af8995508f12eb4b80a7 Mar 20 07:20:01 crc kubenswrapper[4749]: I0320 07:20:01.557219 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-stfh9"] Mar 20 07:20:01 crc kubenswrapper[4749]: W0320 07:20:01.561402 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a3930fe_a227_4dbd_82ec_d9e95f06a317.slice/crio-3eb2f72792ab96572facf7e27d140b48614adf25ba8d46a827a9656553982724 WatchSource:0}: Error finding container 3eb2f72792ab96572facf7e27d140b48614adf25ba8d46a827a9656553982724: Status 404 returned error can't find the container with id 3eb2f72792ab96572facf7e27d140b48614adf25ba8d46a827a9656553982724 Mar 20 07:20:01 crc kubenswrapper[4749]: I0320 07:20:01.767997 4749 generic.go:334] "Generic (PLEG): container finished" podID="6a3930fe-a227-4dbd-82ec-d9e95f06a317" containerID="bcf72696bf3ecaf814e13551a98eae22a7b100f381e250fe1783c5c42afc09b4" exitCode=0 Mar 20 07:20:01 crc kubenswrapper[4749]: I0320 07:20:01.768077 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-stfh9" event={"ID":"6a3930fe-a227-4dbd-82ec-d9e95f06a317","Type":"ContainerDied","Data":"bcf72696bf3ecaf814e13551a98eae22a7b100f381e250fe1783c5c42afc09b4"} Mar 20 07:20:01 crc kubenswrapper[4749]: I0320 07:20:01.768106 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-stfh9" event={"ID":"6a3930fe-a227-4dbd-82ec-d9e95f06a317","Type":"ContainerStarted","Data":"3eb2f72792ab96572facf7e27d140b48614adf25ba8d46a827a9656553982724"} Mar 20 07:20:01 crc kubenswrapper[4749]: I0320 07:20:01.769424 4749 generic.go:334] "Generic (PLEG): container finished" podID="e86bd4a9-6a7b-432d-824c-e03199a458f6" containerID="f555ebe7c537d2fb01a78487dc7b86f26c049aa32481333ee173da4df6901ff6" exitCode=0 Mar 20 07:20:01 crc kubenswrapper[4749]: I0320 07:20:01.770169 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dn6gk" event={"ID":"e86bd4a9-6a7b-432d-824c-e03199a458f6","Type":"ContainerDied","Data":"f555ebe7c537d2fb01a78487dc7b86f26c049aa32481333ee173da4df6901ff6"} Mar 20 07:20:01 crc kubenswrapper[4749]: I0320 07:20:01.770197 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dn6gk" event={"ID":"e86bd4a9-6a7b-432d-824c-e03199a458f6","Type":"ContainerStarted","Data":"b84c313f117b3eab5569a744be2e96c0d625e6e26024af8995508f12eb4b80a7"} Mar 20 07:20:02 crc kubenswrapper[4749]: I0320 07:20:02.777189 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29566520-j7gn9" event={"ID":"2e201ab3-7a56-4786-867f-5beef3df85b8","Type":"ContainerStarted","Data":"2ad0c0b0ad6dc2f74c3f216b96baf7c0c450cc28f3b1a294cb26e0160d039a78"} Mar 20 07:20:02 crc kubenswrapper[4749]: I0320 07:20:02.780695 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dn6gk" event={"ID":"e86bd4a9-6a7b-432d-824c-e03199a458f6","Type":"ContainerStarted","Data":"0d269e9667333f43b92428f990e90b5b96435061e1d41c3b146d5c4a991d216b"} Mar 20 07:20:02 crc kubenswrapper[4749]: I0320 07:20:02.792621 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566520-j7gn9" podStartSLOduration=1.065655653 podStartE2EDuration="2.792604372s" podCreationTimestamp="2026-03-20 07:20:00 +0000 UTC" firstStartedPulling="2026-03-20 07:20:00.741884269 +0000 UTC m=+437.291541916" lastFinishedPulling="2026-03-20 07:20:02.468832988 +0000 UTC m=+439.018490635" observedRunningTime="2026-03-20 07:20:02.791155987 +0000 UTC m=+439.340813634" watchObservedRunningTime="2026-03-20 07:20:02.792604372 +0000 UTC m=+439.342262019" Mar 20 07:20:03 crc kubenswrapper[4749]: I0320 07:20:03.012177 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-t2l62"] Mar 20 07:20:03 crc kubenswrapper[4749]: I0320 07:20:03.013314 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t2l62" Mar 20 07:20:03 crc kubenswrapper[4749]: I0320 07:20:03.015640 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 07:20:03 crc kubenswrapper[4749]: I0320 07:20:03.027197 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t2l62"] Mar 20 07:20:03 crc kubenswrapper[4749]: I0320 07:20:03.146873 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abf5fd3e-fb60-488e-9907-02dc8aa57901-catalog-content\") pod \"redhat-marketplace-t2l62\" (UID: \"abf5fd3e-fb60-488e-9907-02dc8aa57901\") " pod="openshift-marketplace/redhat-marketplace-t2l62" Mar 20 07:20:03 crc kubenswrapper[4749]: I0320 07:20:03.147049 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abf5fd3e-fb60-488e-9907-02dc8aa57901-utilities\") pod \"redhat-marketplace-t2l62\" (UID: \"abf5fd3e-fb60-488e-9907-02dc8aa57901\") " pod="openshift-marketplace/redhat-marketplace-t2l62" Mar 20 07:20:03 crc kubenswrapper[4749]: I0320 07:20:03.147245 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmhcq\" (UniqueName: \"kubernetes.io/projected/abf5fd3e-fb60-488e-9907-02dc8aa57901-kube-api-access-xmhcq\") pod \"redhat-marketplace-t2l62\" (UID: \"abf5fd3e-fb60-488e-9907-02dc8aa57901\") " pod="openshift-marketplace/redhat-marketplace-t2l62" Mar 20 07:20:03 crc kubenswrapper[4749]: I0320 07:20:03.215102 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mt952"] Mar 20 07:20:03 crc kubenswrapper[4749]: I0320 07:20:03.216342 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mt952" Mar 20 07:20:03 crc kubenswrapper[4749]: I0320 07:20:03.217938 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 07:20:03 crc kubenswrapper[4749]: I0320 07:20:03.219133 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mt952"] Mar 20 07:20:03 crc kubenswrapper[4749]: I0320 07:20:03.248144 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abf5fd3e-fb60-488e-9907-02dc8aa57901-catalog-content\") pod \"redhat-marketplace-t2l62\" (UID: \"abf5fd3e-fb60-488e-9907-02dc8aa57901\") " pod="openshift-marketplace/redhat-marketplace-t2l62" Mar 20 07:20:03 crc kubenswrapper[4749]: I0320 07:20:03.248195 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abf5fd3e-fb60-488e-9907-02dc8aa57901-utilities\") pod \"redhat-marketplace-t2l62\" (UID: \"abf5fd3e-fb60-488e-9907-02dc8aa57901\") " pod="openshift-marketplace/redhat-marketplace-t2l62" Mar 20 07:20:03 crc kubenswrapper[4749]: I0320 07:20:03.248241 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmhcq\" (UniqueName: \"kubernetes.io/projected/abf5fd3e-fb60-488e-9907-02dc8aa57901-kube-api-access-xmhcq\") pod \"redhat-marketplace-t2l62\" (UID: \"abf5fd3e-fb60-488e-9907-02dc8aa57901\") " pod="openshift-marketplace/redhat-marketplace-t2l62" Mar 20 07:20:03 crc kubenswrapper[4749]: I0320 07:20:03.248696 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abf5fd3e-fb60-488e-9907-02dc8aa57901-catalog-content\") pod \"redhat-marketplace-t2l62\" (UID: \"abf5fd3e-fb60-488e-9907-02dc8aa57901\") " pod="openshift-marketplace/redhat-marketplace-t2l62" Mar 20 07:20:03 crc kubenswrapper[4749]: I0320 07:20:03.248780 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abf5fd3e-fb60-488e-9907-02dc8aa57901-utilities\") pod \"redhat-marketplace-t2l62\" (UID: \"abf5fd3e-fb60-488e-9907-02dc8aa57901\") " pod="openshift-marketplace/redhat-marketplace-t2l62" Mar 20 07:20:03 crc kubenswrapper[4749]: I0320 07:20:03.269371 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmhcq\" (UniqueName: \"kubernetes.io/projected/abf5fd3e-fb60-488e-9907-02dc8aa57901-kube-api-access-xmhcq\") pod \"redhat-marketplace-t2l62\" (UID: \"abf5fd3e-fb60-488e-9907-02dc8aa57901\") " pod="openshift-marketplace/redhat-marketplace-t2l62" Mar 20 07:20:03 crc kubenswrapper[4749]: I0320 07:20:03.340389 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t2l62" Mar 20 07:20:03 crc kubenswrapper[4749]: I0320 07:20:03.348902 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3b30307-7d91-49c8-a5d4-79c1501c442f-catalog-content\") pod \"redhat-operators-mt952\" (UID: \"d3b30307-7d91-49c8-a5d4-79c1501c442f\") " pod="openshift-marketplace/redhat-operators-mt952" Mar 20 07:20:03 crc kubenswrapper[4749]: I0320 07:20:03.349061 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3b30307-7d91-49c8-a5d4-79c1501c442f-utilities\") pod \"redhat-operators-mt952\" (UID: \"d3b30307-7d91-49c8-a5d4-79c1501c442f\") " pod="openshift-marketplace/redhat-operators-mt952" Mar 20 07:20:03 crc kubenswrapper[4749]: I0320 07:20:03.349271 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k9w9\" (UniqueName: \"kubernetes.io/projected/d3b30307-7d91-49c8-a5d4-79c1501c442f-kube-api-access-2k9w9\") pod \"redhat-operators-mt952\" (UID: \"d3b30307-7d91-49c8-a5d4-79c1501c442f\") " pod="openshift-marketplace/redhat-operators-mt952" Mar 20 07:20:03 crc kubenswrapper[4749]: I0320 07:20:03.450024 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k9w9\" (UniqueName: \"kubernetes.io/projected/d3b30307-7d91-49c8-a5d4-79c1501c442f-kube-api-access-2k9w9\") pod \"redhat-operators-mt952\" (UID: \"d3b30307-7d91-49c8-a5d4-79c1501c442f\") " pod="openshift-marketplace/redhat-operators-mt952" Mar 20 07:20:03 crc kubenswrapper[4749]: I0320 07:20:03.450104 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3b30307-7d91-49c8-a5d4-79c1501c442f-catalog-content\") pod \"redhat-operators-mt952\" (UID: \"d3b30307-7d91-49c8-a5d4-79c1501c442f\") " pod="openshift-marketplace/redhat-operators-mt952" Mar 20 07:20:03 crc kubenswrapper[4749]: I0320 07:20:03.450142 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3b30307-7d91-49c8-a5d4-79c1501c442f-utilities\") pod \"redhat-operators-mt952\" (UID: \"d3b30307-7d91-49c8-a5d4-79c1501c442f\") " pod="openshift-marketplace/redhat-operators-mt952" Mar 20 07:20:03 crc kubenswrapper[4749]: I0320 07:20:03.450855 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3b30307-7d91-49c8-a5d4-79c1501c442f-catalog-content\") pod \"redhat-operators-mt952\" (UID: \"d3b30307-7d91-49c8-a5d4-79c1501c442f\") " pod="openshift-marketplace/redhat-operators-mt952" Mar 20 07:20:03 crc kubenswrapper[4749]: I0320 07:20:03.450945 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3b30307-7d91-49c8-a5d4-79c1501c442f-utilities\") pod \"redhat-operators-mt952\" (UID: \"d3b30307-7d91-49c8-a5d4-79c1501c442f\") " pod="openshift-marketplace/redhat-operators-mt952" Mar 20 07:20:03 crc kubenswrapper[4749]: I0320 07:20:03.473776 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k9w9\" (UniqueName: \"kubernetes.io/projected/d3b30307-7d91-49c8-a5d4-79c1501c442f-kube-api-access-2k9w9\") pod \"redhat-operators-mt952\" (UID: 
\"d3b30307-7d91-49c8-a5d4-79c1501c442f\") " pod="openshift-marketplace/redhat-operators-mt952" Mar 20 07:20:03 crc kubenswrapper[4749]: I0320 07:20:03.518818 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t2l62"] Mar 20 07:20:03 crc kubenswrapper[4749]: W0320 07:20:03.525653 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabf5fd3e_fb60_488e_9907_02dc8aa57901.slice/crio-f1ec7e0b9b077f81cf034cd5618b13627a8b8f2517e85b62c7f461810b5ed6a1 WatchSource:0}: Error finding container f1ec7e0b9b077f81cf034cd5618b13627a8b8f2517e85b62c7f461810b5ed6a1: Status 404 returned error can't find the container with id f1ec7e0b9b077f81cf034cd5618b13627a8b8f2517e85b62c7f461810b5ed6a1 Mar 20 07:20:03 crc kubenswrapper[4749]: I0320 07:20:03.538174 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mt952" Mar 20 07:20:03 crc kubenswrapper[4749]: I0320 07:20:03.715461 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mt952"] Mar 20 07:20:03 crc kubenswrapper[4749]: W0320 07:20:03.780724 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3b30307_7d91_49c8_a5d4_79c1501c442f.slice/crio-4f8644532e92b3186c22ce9dc2bc677e22559d16d6acb8cdd36c62a74f4d187a WatchSource:0}: Error finding container 4f8644532e92b3186c22ce9dc2bc677e22559d16d6acb8cdd36c62a74f4d187a: Status 404 returned error can't find the container with id 4f8644532e92b3186c22ce9dc2bc677e22559d16d6acb8cdd36c62a74f4d187a Mar 20 07:20:03 crc kubenswrapper[4749]: I0320 07:20:03.793070 4749 generic.go:334] "Generic (PLEG): container finished" podID="abf5fd3e-fb60-488e-9907-02dc8aa57901" containerID="b2b5c869f39727f4ef1ca84a4bfe84c90e14795212b4f3a09a911584bf36f0bc" exitCode=0 Mar 20 07:20:03 crc kubenswrapper[4749]: I0320 07:20:03.793126 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t2l62" event={"ID":"abf5fd3e-fb60-488e-9907-02dc8aa57901","Type":"ContainerDied","Data":"b2b5c869f39727f4ef1ca84a4bfe84c90e14795212b4f3a09a911584bf36f0bc"} Mar 20 07:20:03 crc kubenswrapper[4749]: I0320 07:20:03.793151 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t2l62" event={"ID":"abf5fd3e-fb60-488e-9907-02dc8aa57901","Type":"ContainerStarted","Data":"f1ec7e0b9b077f81cf034cd5618b13627a8b8f2517e85b62c7f461810b5ed6a1"} Mar 20 07:20:03 crc kubenswrapper[4749]: I0320 07:20:03.796137 4749 generic.go:334] "Generic (PLEG): container finished" podID="2e201ab3-7a56-4786-867f-5beef3df85b8" containerID="2ad0c0b0ad6dc2f74c3f216b96baf7c0c450cc28f3b1a294cb26e0160d039a78" exitCode=0 Mar 20 07:20:03 crc kubenswrapper[4749]: I0320 07:20:03.796230 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566520-j7gn9" event={"ID":"2e201ab3-7a56-4786-867f-5beef3df85b8","Type":"ContainerDied","Data":"2ad0c0b0ad6dc2f74c3f216b96baf7c0c450cc28f3b1a294cb26e0160d039a78"} Mar 20 07:20:03 crc kubenswrapper[4749]: I0320 07:20:03.798031 4749 generic.go:334] "Generic (PLEG): container finished" podID="6a3930fe-a227-4dbd-82ec-d9e95f06a317" containerID="3359aae1a50aa53f1c3e07fb925baadb6e1528b9841ba2771de24abd137f0b7b" exitCode=0 Mar 20 07:20:03 crc kubenswrapper[4749]: I0320 07:20:03.798075 4749 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/community-operators-stfh9" event={"ID":"6a3930fe-a227-4dbd-82ec-d9e95f06a317","Type":"ContainerDied","Data":"3359aae1a50aa53f1c3e07fb925baadb6e1528b9841ba2771de24abd137f0b7b"} Mar 20 07:20:03 crc kubenswrapper[4749]: I0320 07:20:03.800473 4749 generic.go:334] "Generic (PLEG): container finished" podID="e86bd4a9-6a7b-432d-824c-e03199a458f6" containerID="0d269e9667333f43b92428f990e90b5b96435061e1d41c3b146d5c4a991d216b" exitCode=0 Mar 20 07:20:03 crc kubenswrapper[4749]: I0320 07:20:03.800497 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dn6gk" event={"ID":"e86bd4a9-6a7b-432d-824c-e03199a458f6","Type":"ContainerDied","Data":"0d269e9667333f43b92428f990e90b5b96435061e1d41c3b146d5c4a991d216b"} Mar 20 07:20:04 crc kubenswrapper[4749]: I0320 07:20:04.518768 4749 patch_prober.go:28] interesting pod/machine-config-daemon-fxqfd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 07:20:04 crc kubenswrapper[4749]: I0320 07:20:04.519150 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:20:04 crc kubenswrapper[4749]: I0320 07:20:04.519267 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" Mar 20 07:20:04 crc kubenswrapper[4749]: I0320 07:20:04.520068 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4c9937006b944b57a7ace3d87b4c4a8a6a9f78e9d693469869b65f6df516a69c"} pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 07:20:04 crc kubenswrapper[4749]: I0320 07:20:04.520184 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerName="machine-config-daemon" containerID="cri-o://4c9937006b944b57a7ace3d87b4c4a8a6a9f78e9d693469869b65f6df516a69c" gracePeriod=600 Mar 20 07:20:04 crc kubenswrapper[4749]: I0320 07:20:04.808449 4749 generic.go:334] "Generic (PLEG): container finished" podID="d3b30307-7d91-49c8-a5d4-79c1501c442f" containerID="95ea9b6550c62eca7682bc813b29c51cf300b3db56fcf5d60bf0b313ce404a4d" exitCode=0 Mar 20 07:20:04 crc kubenswrapper[4749]: I0320 07:20:04.808683 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mt952" event={"ID":"d3b30307-7d91-49c8-a5d4-79c1501c442f","Type":"ContainerDied","Data":"95ea9b6550c62eca7682bc813b29c51cf300b3db56fcf5d60bf0b313ce404a4d"} Mar 20 07:20:04 crc kubenswrapper[4749]: I0320 07:20:04.808911 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mt952" event={"ID":"d3b30307-7d91-49c8-a5d4-79c1501c442f","Type":"ContainerStarted","Data":"4f8644532e92b3186c22ce9dc2bc677e22559d16d6acb8cdd36c62a74f4d187a"} Mar 20 07:20:04 crc kubenswrapper[4749]: I0320 07:20:04.811445 4749 generic.go:334] "Generic 
(PLEG): container finished" podID="abf5fd3e-fb60-488e-9907-02dc8aa57901" containerID="151bd74df3a500ca83745e2d478d0151b7453ad0a08a490c186c12c38b24be8a" exitCode=0 Mar 20 07:20:04 crc kubenswrapper[4749]: I0320 07:20:04.811524 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t2l62" event={"ID":"abf5fd3e-fb60-488e-9907-02dc8aa57901","Type":"ContainerDied","Data":"151bd74df3a500ca83745e2d478d0151b7453ad0a08a490c186c12c38b24be8a"} Mar 20 07:20:04 crc kubenswrapper[4749]: I0320 07:20:04.813425 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-stfh9" event={"ID":"6a3930fe-a227-4dbd-82ec-d9e95f06a317","Type":"ContainerStarted","Data":"f6b2fac2e461e50f05cef1faab4c29fdb7e31db074c484c4e0da070c7486bfb3"} Mar 20 07:20:04 crc kubenswrapper[4749]: I0320 07:20:04.817099 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dn6gk" event={"ID":"e86bd4a9-6a7b-432d-824c-e03199a458f6","Type":"ContainerStarted","Data":"4d610c758b0fd178b1ed771d4a42f065d10ba7ff608c79ffedcde7ffbfe14c85"} Mar 20 07:20:04 crc kubenswrapper[4749]: I0320 07:20:04.819680 4749 generic.go:334] "Generic (PLEG): container finished" podID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerID="4c9937006b944b57a7ace3d87b4c4a8a6a9f78e9d693469869b65f6df516a69c" exitCode=0 Mar 20 07:20:04 crc kubenswrapper[4749]: I0320 07:20:04.819777 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" event={"ID":"12151228-1cb9-4086-9a62-f4a9583f5f69","Type":"ContainerDied","Data":"4c9937006b944b57a7ace3d87b4c4a8a6a9f78e9d693469869b65f6df516a69c"} Mar 20 07:20:04 crc kubenswrapper[4749]: I0320 07:20:04.819862 4749 scope.go:117] "RemoveContainer" containerID="e7e97608b8dbd15f9f6a4df363aa16c0f7e4a3d501a4182627876064290b63e9" Mar 20 07:20:04 crc kubenswrapper[4749]: I0320 07:20:04.866878 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-stfh9" podStartSLOduration=2.19910212 podStartE2EDuration="4.866863229s" podCreationTimestamp="2026-03-20 07:20:00 +0000 UTC" firstStartedPulling="2026-03-20 07:20:01.770745114 +0000 UTC m=+438.320402761" lastFinishedPulling="2026-03-20 07:20:04.438506223 +0000 UTC m=+440.988163870" observedRunningTime="2026-03-20 07:20:04.865801293 +0000 UTC m=+441.415458940" watchObservedRunningTime="2026-03-20 07:20:04.866863229 +0000 UTC m=+441.416520876" Mar 20 07:20:05 crc kubenswrapper[4749]: I0320 07:20:05.083177 4749 util.go:48] "No ready sandbox for pod can be found. 
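The 07:20:04 entries show the full liveness-failure path: patch_prober records the refused HTTP GET on 127.0.0.1:8798, "SyncLoop (probe)" marks the container unhealthy, and kuberuntime_container kills it with gracePeriod=600 so it can be restarted; the PLEG ContainerDied event follows within the second. A small sketch (assuming only the log format shown here) that correlates the two kinds of entries:

    import re, sys
    from collections import Counter

    fails, kills = Counter(), []
    for line in sys.stdin:
        m = re.search(r'"Probe failed" probeType="Liveness" pod="([^"]+)"', line)
        if m:
            fails[m.group(1)] += 1
        m = re.search(r'"Killing container with a grace period" pod="([^"]+)".*gracePeriod=(\d+)', line)
        if m:
            kills.append((line[:15], m.group(1), int(m.group(2))))
    for pod, n in fails.most_common():
        print(f"{n:3d} liveness failures  {pod}")
    for ts, pod, grace in kills:
        print(f"{ts} killed {pod} (gracePeriod={grace}s)")

Over this window it flags machine-config-daemon-fxqfd as the one pod being repeatedly probe-killed.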
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566520-j7gn9" Mar 20 07:20:05 crc kubenswrapper[4749]: I0320 07:20:05.109472 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dn6gk" podStartSLOduration=2.47896522 podStartE2EDuration="5.109453024s" podCreationTimestamp="2026-03-20 07:20:00 +0000 UTC" firstStartedPulling="2026-03-20 07:20:01.772069596 +0000 UTC m=+438.321727243" lastFinishedPulling="2026-03-20 07:20:04.40255739 +0000 UTC m=+440.952215047" observedRunningTime="2026-03-20 07:20:04.903615922 +0000 UTC m=+441.453273569" watchObservedRunningTime="2026-03-20 07:20:05.109453024 +0000 UTC m=+441.659110671" Mar 20 07:20:05 crc kubenswrapper[4749]: I0320 07:20:05.270978 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8gc5\" (UniqueName: \"kubernetes.io/projected/2e201ab3-7a56-4786-867f-5beef3df85b8-kube-api-access-d8gc5\") pod \"2e201ab3-7a56-4786-867f-5beef3df85b8\" (UID: \"2e201ab3-7a56-4786-867f-5beef3df85b8\") " Mar 20 07:20:05 crc kubenswrapper[4749]: I0320 07:20:05.281409 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e201ab3-7a56-4786-867f-5beef3df85b8-kube-api-access-d8gc5" (OuterVolumeSpecName: "kube-api-access-d8gc5") pod "2e201ab3-7a56-4786-867f-5beef3df85b8" (UID: "2e201ab3-7a56-4786-867f-5beef3df85b8"). InnerVolumeSpecName "kube-api-access-d8gc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:20:05 crc kubenswrapper[4749]: I0320 07:20:05.373491 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8gc5\" (UniqueName: \"kubernetes.io/projected/2e201ab3-7a56-4786-867f-5beef3df85b8-kube-api-access-d8gc5\") on node \"crc\" DevicePath \"\"" Mar 20 07:20:05 crc kubenswrapper[4749]: I0320 07:20:05.826802 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" event={"ID":"12151228-1cb9-4086-9a62-f4a9583f5f69","Type":"ContainerStarted","Data":"2b2a2005813627d86d64d4f38a80b5429cc2b61afa4394e9316e568f5d812e7d"} Mar 20 07:20:05 crc kubenswrapper[4749]: I0320 07:20:05.828459 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mt952" event={"ID":"d3b30307-7d91-49c8-a5d4-79c1501c442f","Type":"ContainerStarted","Data":"9365088ce57c83322b081a0fe202f5a04e7e3279d5f8d7a4094ff2b4324cb105"} Mar 20 07:20:05 crc kubenswrapper[4749]: I0320 07:20:05.831330 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t2l62" event={"ID":"abf5fd3e-fb60-488e-9907-02dc8aa57901","Type":"ContainerStarted","Data":"f404622318de2b4aa9d3571e083f64b93e9d5239b0c3e9cb27c3766b1a5d995a"} Mar 20 07:20:05 crc kubenswrapper[4749]: I0320 07:20:05.833102 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566520-j7gn9" Mar 20 07:20:05 crc kubenswrapper[4749]: I0320 07:20:05.841602 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566520-j7gn9" event={"ID":"2e201ab3-7a56-4786-867f-5beef3df85b8","Type":"ContainerDied","Data":"8fdd15c6c0c54d0e7256c668273b539d4981daf07ab9aee62c4f161254b7ae5a"} Mar 20 07:20:05 crc kubenswrapper[4749]: I0320 07:20:05.841636 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8fdd15c6c0c54d0e7256c668273b539d4981daf07ab9aee62c4f161254b7ae5a" Mar 20 07:20:05 crc kubenswrapper[4749]: I0320 07:20:05.869888 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-t2l62" podStartSLOduration=2.234845784 podStartE2EDuration="3.869873674s" podCreationTimestamp="2026-03-20 07:20:02 +0000 UTC" firstStartedPulling="2026-03-20 07:20:03.801839686 +0000 UTC m=+440.351497333" lastFinishedPulling="2026-03-20 07:20:05.436867576 +0000 UTC m=+441.986525223" observedRunningTime="2026-03-20 07:20:05.86722115 +0000 UTC m=+442.416878797" watchObservedRunningTime="2026-03-20 07:20:05.869873674 +0000 UTC m=+442.419531321" Mar 20 07:20:06 crc kubenswrapper[4749]: I0320 07:20:06.840590 4749 generic.go:334] "Generic (PLEG): container finished" podID="d3b30307-7d91-49c8-a5d4-79c1501c442f" containerID="9365088ce57c83322b081a0fe202f5a04e7e3279d5f8d7a4094ff2b4324cb105" exitCode=0 Mar 20 07:20:06 crc kubenswrapper[4749]: I0320 07:20:06.840695 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mt952" event={"ID":"d3b30307-7d91-49c8-a5d4-79c1501c442f","Type":"ContainerDied","Data":"9365088ce57c83322b081a0fe202f5a04e7e3279d5f8d7a4094ff2b4324cb105"} Mar 20 07:20:07 crc kubenswrapper[4749]: I0320 07:20:07.850568 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mt952" event={"ID":"d3b30307-7d91-49c8-a5d4-79c1501c442f","Type":"ContainerStarted","Data":"76cd697f14cd8e7894263792fdb9e54037408463fa9552427ff92e2cd2377bc8"} Mar 20 07:20:07 crc kubenswrapper[4749]: I0320 07:20:07.871817 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mt952" podStartSLOduration=2.346561297 podStartE2EDuration="4.871803934s" podCreationTimestamp="2026-03-20 07:20:03 +0000 UTC" firstStartedPulling="2026-03-20 07:20:04.810548297 +0000 UTC m=+441.360205944" lastFinishedPulling="2026-03-20 07:20:07.335790934 +0000 UTC m=+443.885448581" observedRunningTime="2026-03-20 07:20:07.86955254 +0000 UTC m=+444.419210187" watchObservedRunningTime="2026-03-20 07:20:07.871803934 +0000 UTC m=+444.421461581" Mar 20 07:20:10 crc kubenswrapper[4749]: I0320 07:20:10.974783 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dn6gk" Mar 20 07:20:10 crc kubenswrapper[4749]: I0320 07:20:10.975118 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dn6gk" Mar 20 07:20:11 crc kubenswrapper[4749]: I0320 07:20:11.047050 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dn6gk" Mar 20 07:20:11 crc kubenswrapper[4749]: I0320 07:20:11.138012 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-stfh9" Mar 20 07:20:11 crc 
Mar 20 07:20:11 crc kubenswrapper[4749]: I0320 07:20:11.199723 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-stfh9"
Mar 20 07:20:11 crc kubenswrapper[4749]: I0320 07:20:11.931262 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dn6gk"
Mar 20 07:20:11 crc kubenswrapper[4749]: I0320 07:20:11.943412 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-stfh9"
Mar 20 07:20:13 crc kubenswrapper[4749]: I0320 07:20:13.340669 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-t2l62"
Mar 20 07:20:13 crc kubenswrapper[4749]: I0320 07:20:13.341222 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-t2l62"
Mar 20 07:20:13 crc kubenswrapper[4749]: I0320 07:20:13.413259 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-t2l62"
Mar 20 07:20:13 crc kubenswrapper[4749]: I0320 07:20:13.538337 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mt952"
Mar 20 07:20:13 crc kubenswrapper[4749]: I0320 07:20:13.539040 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mt952"
Mar 20 07:20:13 crc kubenswrapper[4749]: I0320 07:20:13.575431 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-2q8qh"
Mar 20 07:20:13 crc kubenswrapper[4749]: I0320 07:20:13.631803 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zz6kk"]
Mar 20 07:20:13 crc kubenswrapper[4749]: I0320 07:20:13.940101 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-t2l62"
Mar 20 07:20:14 crc kubenswrapper[4749]: I0320 07:20:14.580881 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mt952" podUID="d3b30307-7d91-49c8-a5d4-79c1501c442f" containerName="registry-server" probeResult="failure" output=<
Mar 20 07:20:14 crc kubenswrapper[4749]: 	timeout: failed to connect service ":50051" within 1s
Mar 20 07:20:14 crc kubenswrapper[4749]:  >
Mar 20 07:20:23 crc kubenswrapper[4749]: I0320 07:20:23.605651 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mt952"
Mar 20 07:20:23 crc kubenswrapper[4749]: I0320 07:20:23.669828 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mt952"
Mar 20 07:20:38 crc kubenswrapper[4749]: I0320 07:20:38.683125 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" podUID="473085e8-ee17-4244-abd0-dcf2308b4655" containerName="registry" containerID="cri-o://b80e21067c1aaa7790ff4209fbe44d9983502a44bcb10a8772fd02b4457779a8" gracePeriod=30
Mar 20 07:20:39 crc kubenswrapper[4749]: I0320 07:20:39.004203 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk"
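The "SyncLoop (probe)" entries trace each catalog pod through startup: unhealthy, then started, then readiness ready; an empty status string means the readiness result is not yet known. A sketch that rebuilds this per-pod timeline from the format above:

    import re, sys
    from collections import defaultdict

    timeline = defaultdict(list)
    pat = re.compile(r'"SyncLoop \(probe\)" probe="(\w+)" status="(\w*)" pod="([^"]+)"')
    for line in sys.stdin:
        m = pat.search(line)
        if m:
            probe, status, pod = m.groups()
            timeline[pod].append((line[:15], probe, status or "<empty>"))
    for pod, events in timeline.items():
        print(pod)
        for ts, probe, status in events:
            print(f"  {ts}  {probe:9s} -> {status}")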
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:20:39 crc kubenswrapper[4749]: I0320 07:20:39.040421 4749 generic.go:334] "Generic (PLEG): container finished" podID="473085e8-ee17-4244-abd0-dcf2308b4655" containerID="b80e21067c1aaa7790ff4209fbe44d9983502a44bcb10a8772fd02b4457779a8" exitCode=0 Mar 20 07:20:39 crc kubenswrapper[4749]: I0320 07:20:39.040460 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" event={"ID":"473085e8-ee17-4244-abd0-dcf2308b4655","Type":"ContainerDied","Data":"b80e21067c1aaa7790ff4209fbe44d9983502a44bcb10a8772fd02b4457779a8"} Mar 20 07:20:39 crc kubenswrapper[4749]: I0320 07:20:39.040483 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" Mar 20 07:20:39 crc kubenswrapper[4749]: I0320 07:20:39.040496 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-zz6kk" event={"ID":"473085e8-ee17-4244-abd0-dcf2308b4655","Type":"ContainerDied","Data":"0fbe29db15f5ac137c6b49137c50d1e1112b8b4b098a29b89bd6f3bebb487a8c"} Mar 20 07:20:39 crc kubenswrapper[4749]: I0320 07:20:39.040516 4749 scope.go:117] "RemoveContainer" containerID="b80e21067c1aaa7790ff4209fbe44d9983502a44bcb10a8772fd02b4457779a8" Mar 20 07:20:39 crc kubenswrapper[4749]: I0320 07:20:39.057365 4749 scope.go:117] "RemoveContainer" containerID="b80e21067c1aaa7790ff4209fbe44d9983502a44bcb10a8772fd02b4457779a8" Mar 20 07:20:39 crc kubenswrapper[4749]: E0320 07:20:39.057946 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b80e21067c1aaa7790ff4209fbe44d9983502a44bcb10a8772fd02b4457779a8\": container with ID starting with b80e21067c1aaa7790ff4209fbe44d9983502a44bcb10a8772fd02b4457779a8 not found: ID does not exist" containerID="b80e21067c1aaa7790ff4209fbe44d9983502a44bcb10a8772fd02b4457779a8" Mar 20 07:20:39 crc kubenswrapper[4749]: I0320 07:20:39.058001 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b80e21067c1aaa7790ff4209fbe44d9983502a44bcb10a8772fd02b4457779a8"} err="failed to get container status \"b80e21067c1aaa7790ff4209fbe44d9983502a44bcb10a8772fd02b4457779a8\": rpc error: code = NotFound desc = could not find container \"b80e21067c1aaa7790ff4209fbe44d9983502a44bcb10a8772fd02b4457779a8\": container with ID starting with b80e21067c1aaa7790ff4209fbe44d9983502a44bcb10a8772fd02b4457779a8 not found: ID does not exist" Mar 20 07:20:39 crc kubenswrapper[4749]: I0320 07:20:39.074784 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"473085e8-ee17-4244-abd0-dcf2308b4655\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " Mar 20 07:20:39 crc kubenswrapper[4749]: I0320 07:20:39.074818 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xmkj\" (UniqueName: \"kubernetes.io/projected/473085e8-ee17-4244-abd0-dcf2308b4655-kube-api-access-8xmkj\") pod \"473085e8-ee17-4244-abd0-dcf2308b4655\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " Mar 20 07:20:39 crc kubenswrapper[4749]: I0320 07:20:39.074844 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/473085e8-ee17-4244-abd0-dcf2308b4655-registry-tls\") pod \"473085e8-ee17-4244-abd0-dcf2308b4655\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " Mar 20 07:20:39 crc kubenswrapper[4749]: I0320 07:20:39.074862 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/473085e8-ee17-4244-abd0-dcf2308b4655-installation-pull-secrets\") pod \"473085e8-ee17-4244-abd0-dcf2308b4655\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " Mar 20 07:20:39 crc kubenswrapper[4749]: I0320 07:20:39.074880 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/473085e8-ee17-4244-abd0-dcf2308b4655-trusted-ca\") pod \"473085e8-ee17-4244-abd0-dcf2308b4655\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " Mar 20 07:20:39 crc kubenswrapper[4749]: I0320 07:20:39.074911 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/473085e8-ee17-4244-abd0-dcf2308b4655-bound-sa-token\") pod \"473085e8-ee17-4244-abd0-dcf2308b4655\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " Mar 20 07:20:39 crc kubenswrapper[4749]: I0320 07:20:39.074937 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/473085e8-ee17-4244-abd0-dcf2308b4655-registry-certificates\") pod \"473085e8-ee17-4244-abd0-dcf2308b4655\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " Mar 20 07:20:39 crc kubenswrapper[4749]: I0320 07:20:39.074969 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/473085e8-ee17-4244-abd0-dcf2308b4655-ca-trust-extracted\") pod \"473085e8-ee17-4244-abd0-dcf2308b4655\" (UID: \"473085e8-ee17-4244-abd0-dcf2308b4655\") " Mar 20 07:20:39 crc kubenswrapper[4749]: I0320 07:20:39.075668 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/473085e8-ee17-4244-abd0-dcf2308b4655-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "473085e8-ee17-4244-abd0-dcf2308b4655" (UID: "473085e8-ee17-4244-abd0-dcf2308b4655"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:20:39 crc kubenswrapper[4749]: I0320 07:20:39.076216 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/473085e8-ee17-4244-abd0-dcf2308b4655-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "473085e8-ee17-4244-abd0-dcf2308b4655" (UID: "473085e8-ee17-4244-abd0-dcf2308b4655"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:20:39 crc kubenswrapper[4749]: I0320 07:20:39.081526 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/473085e8-ee17-4244-abd0-dcf2308b4655-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "473085e8-ee17-4244-abd0-dcf2308b4655" (UID: "473085e8-ee17-4244-abd0-dcf2308b4655"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:20:39 crc kubenswrapper[4749]: I0320 07:20:39.081838 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/473085e8-ee17-4244-abd0-dcf2308b4655-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "473085e8-ee17-4244-abd0-dcf2308b4655" (UID: "473085e8-ee17-4244-abd0-dcf2308b4655"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:20:39 crc kubenswrapper[4749]: I0320 07:20:39.084021 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/473085e8-ee17-4244-abd0-dcf2308b4655-kube-api-access-8xmkj" (OuterVolumeSpecName: "kube-api-access-8xmkj") pod "473085e8-ee17-4244-abd0-dcf2308b4655" (UID: "473085e8-ee17-4244-abd0-dcf2308b4655"). InnerVolumeSpecName "kube-api-access-8xmkj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:20:39 crc kubenswrapper[4749]: I0320 07:20:39.086258 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "473085e8-ee17-4244-abd0-dcf2308b4655" (UID: "473085e8-ee17-4244-abd0-dcf2308b4655"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 07:20:39 crc kubenswrapper[4749]: I0320 07:20:39.095044 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/473085e8-ee17-4244-abd0-dcf2308b4655-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "473085e8-ee17-4244-abd0-dcf2308b4655" (UID: "473085e8-ee17-4244-abd0-dcf2308b4655"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:20:39 crc kubenswrapper[4749]: I0320 07:20:39.095830 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/473085e8-ee17-4244-abd0-dcf2308b4655-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "473085e8-ee17-4244-abd0-dcf2308b4655" (UID: "473085e8-ee17-4244-abd0-dcf2308b4655"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:20:39 crc kubenswrapper[4749]: I0320 07:20:39.176190 4749 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/473085e8-ee17-4244-abd0-dcf2308b4655-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 20 07:20:39 crc kubenswrapper[4749]: I0320 07:20:39.176214 4749 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/473085e8-ee17-4244-abd0-dcf2308b4655-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 20 07:20:39 crc kubenswrapper[4749]: I0320 07:20:39.176222 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xmkj\" (UniqueName: \"kubernetes.io/projected/473085e8-ee17-4244-abd0-dcf2308b4655-kube-api-access-8xmkj\") on node \"crc\" DevicePath \"\"" Mar 20 07:20:39 crc kubenswrapper[4749]: I0320 07:20:39.176231 4749 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/473085e8-ee17-4244-abd0-dcf2308b4655-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 20 07:20:39 crc kubenswrapper[4749]: I0320 07:20:39.176241 4749 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/473085e8-ee17-4244-abd0-dcf2308b4655-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 20 07:20:39 crc kubenswrapper[4749]: I0320 07:20:39.176249 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/473085e8-ee17-4244-abd0-dcf2308b4655-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 07:20:39 crc kubenswrapper[4749]: I0320 07:20:39.176256 4749 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/473085e8-ee17-4244-abd0-dcf2308b4655-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 07:20:39 crc kubenswrapper[4749]: I0320 07:20:39.372911 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zz6kk"] Mar 20 07:20:39 crc kubenswrapper[4749]: I0320 07:20:39.376829 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zz6kk"] Mar 20 07:20:40 crc kubenswrapper[4749]: I0320 07:20:40.187802 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="473085e8-ee17-4244-abd0-dcf2308b4655" path="/var/lib/kubelet/pods/473085e8-ee17-4244-abd0-dcf2308b4655/volumes" Mar 20 07:20:52 crc kubenswrapper[4749]: I0320 07:20:52.449175 4749 scope.go:117] "RemoveContainer" containerID="688e8fa067ea553fac09be724c46f16706c8b3463f09d6a4e2cfe3212027da17" Mar 20 07:22:00 crc kubenswrapper[4749]: I0320 07:22:00.149554 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566522-f2md2"] Mar 20 07:22:00 crc kubenswrapper[4749]: E0320 07:22:00.150908 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="473085e8-ee17-4244-abd0-dcf2308b4655" containerName="registry" Mar 20 07:22:00 crc kubenswrapper[4749]: I0320 07:22:00.150942 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="473085e8-ee17-4244-abd0-dcf2308b4655" containerName="registry" Mar 20 07:22:00 crc kubenswrapper[4749]: E0320 07:22:00.150968 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e201ab3-7a56-4786-867f-5beef3df85b8" containerName="oc" Mar 20 07:22:00 crc kubenswrapper[4749]: 
Mar 20 07:22:00 crc kubenswrapper[4749]: I0320 07:22:00.151216 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e201ab3-7a56-4786-867f-5beef3df85b8" containerName="oc"
Mar 20 07:22:00 crc kubenswrapper[4749]: I0320 07:22:00.151250 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="473085e8-ee17-4244-abd0-dcf2308b4655" containerName="registry"
Mar 20 07:22:00 crc kubenswrapper[4749]: I0320 07:22:00.152193 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566522-f2md2"
Mar 20 07:22:00 crc kubenswrapper[4749]: I0320 07:22:00.155369 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566522-f2md2"]
Mar 20 07:22:00 crc kubenswrapper[4749]: I0320 07:22:00.155417 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 07:22:00 crc kubenswrapper[4749]: I0320 07:22:00.155476 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 07:22:00 crc kubenswrapper[4749]: I0320 07:22:00.155528 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6rhdf"
Mar 20 07:22:00 crc kubenswrapper[4749]: I0320 07:22:00.211548 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlfhn\" (UniqueName: \"kubernetes.io/projected/80a300a4-78f9-4407-af0b-60e66f310b87-kube-api-access-mlfhn\") pod \"auto-csr-approver-29566522-f2md2\" (UID: \"80a300a4-78f9-4407-af0b-60e66f310b87\") " pod="openshift-infra/auto-csr-approver-29566522-f2md2"
Mar 20 07:22:00 crc kubenswrapper[4749]: I0320 07:22:00.312810 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlfhn\" (UniqueName: \"kubernetes.io/projected/80a300a4-78f9-4407-af0b-60e66f310b87-kube-api-access-mlfhn\") pod \"auto-csr-approver-29566522-f2md2\" (UID: \"80a300a4-78f9-4407-af0b-60e66f310b87\") " pod="openshift-infra/auto-csr-approver-29566522-f2md2"
Mar 20 07:22:00 crc kubenswrapper[4749]: I0320 07:22:00.346601 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlfhn\" (UniqueName: \"kubernetes.io/projected/80a300a4-78f9-4407-af0b-60e66f310b87-kube-api-access-mlfhn\") pod \"auto-csr-approver-29566522-f2md2\" (UID: \"80a300a4-78f9-4407-af0b-60e66f310b87\") " pod="openshift-infra/auto-csr-approver-29566522-f2md2"
Mar 20 07:22:00 crc kubenswrapper[4749]: I0320 07:22:00.490727 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566522-f2md2"
Mar 20 07:22:00 crc kubenswrapper[4749]: I0320 07:22:00.727417 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566522-f2md2"]
Mar 20 07:22:00 crc kubenswrapper[4749]: I0320 07:22:00.738467 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 07:22:01 crc kubenswrapper[4749]: I0320 07:22:01.603416 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566522-f2md2" event={"ID":"80a300a4-78f9-4407-af0b-60e66f310b87","Type":"ContainerStarted","Data":"91fcb8e80310b68624daf73d86907db5400a57a089b7563f28ba3d80095aa612"}
Mar 20 07:22:02 crc kubenswrapper[4749]: I0320 07:22:02.610746 4749 generic.go:334] "Generic (PLEG): container finished" podID="80a300a4-78f9-4407-af0b-60e66f310b87" containerID="ce9797d774005ed22ac91896d8e0594d834d974f75b6894990b7f17919a4a2db" exitCode=0
Mar 20 07:22:02 crc kubenswrapper[4749]: I0320 07:22:02.611060 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566522-f2md2" event={"ID":"80a300a4-78f9-4407-af0b-60e66f310b87","Type":"ContainerDied","Data":"ce9797d774005ed22ac91896d8e0594d834d974f75b6894990b7f17919a4a2db"}
Mar 20 07:22:03 crc kubenswrapper[4749]: I0320 07:22:03.815327 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566522-f2md2"
Mar 20 07:22:03 crc kubenswrapper[4749]: I0320 07:22:03.867742 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlfhn\" (UniqueName: \"kubernetes.io/projected/80a300a4-78f9-4407-af0b-60e66f310b87-kube-api-access-mlfhn\") pod \"80a300a4-78f9-4407-af0b-60e66f310b87\" (UID: \"80a300a4-78f9-4407-af0b-60e66f310b87\") "
Mar 20 07:22:03 crc kubenswrapper[4749]: I0320 07:22:03.873342 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80a300a4-78f9-4407-af0b-60e66f310b87-kube-api-access-mlfhn" (OuterVolumeSpecName: "kube-api-access-mlfhn") pod "80a300a4-78f9-4407-af0b-60e66f310b87" (UID: "80a300a4-78f9-4407-af0b-60e66f310b87"). InnerVolumeSpecName "kube-api-access-mlfhn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 07:22:03 crc kubenswrapper[4749]: I0320 07:22:03.969887 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlfhn\" (UniqueName: \"kubernetes.io/projected/80a300a4-78f9-4407-af0b-60e66f310b87-kube-api-access-mlfhn\") on node \"crc\" DevicePath \"\""
Mar 20 07:22:04 crc kubenswrapper[4749]: I0320 07:22:04.514630 4749 patch_prober.go:28] interesting pod/machine-config-daemon-fxqfd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 07:22:04 crc kubenswrapper[4749]: I0320 07:22:04.514707 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 07:22:04 crc kubenswrapper[4749]: I0320 07:22:04.623160 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566522-f2md2" event={"ID":"80a300a4-78f9-4407-af0b-60e66f310b87","Type":"ContainerDied","Data":"91fcb8e80310b68624daf73d86907db5400a57a089b7563f28ba3d80095aa612"}
Mar 20 07:22:04 crc kubenswrapper[4749]: I0320 07:22:04.623530 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91fcb8e80310b68624daf73d86907db5400a57a089b7563f28ba3d80095aa612"
Mar 20 07:22:04 crc kubenswrapper[4749]: I0320 07:22:04.623249 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566522-f2md2"
Mar 20 07:22:04 crc kubenswrapper[4749]: I0320 07:22:04.876683 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566516-6stbk"]
Mar 20 07:22:04 crc kubenswrapper[4749]: I0320 07:22:04.879442 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566516-6stbk"]
Mar 20 07:22:06 crc kubenswrapper[4749]: I0320 07:22:06.185763 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da95cd86-f90a-4d7f-a308-4124b22d8427" path="/var/lib/kubelet/pods/da95cd86-f90a-4d7f-a308-4124b22d8427/volumes"
Mar 20 07:22:34 crc kubenswrapper[4749]: I0320 07:22:34.514392 4749 patch_prober.go:28] interesting pod/machine-config-daemon-fxqfd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 07:22:34 crc kubenswrapper[4749]: I0320 07:22:34.515156 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 07:22:52 crc kubenswrapper[4749]: I0320 07:22:52.579022 4749 scope.go:117] "RemoveContainer" containerID="8d8776c7ee0b6c89af483156c57775e92b79920a72800eb0f3d1b5e91e32ffa3"
Mar 20 07:23:04 crc kubenswrapper[4749]: I0320 07:23:04.514794 4749 patch_prober.go:28] interesting pod/machine-config-daemon-fxqfd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 07:23:04 crc kubenswrapper[4749]: I0320 07:23:04.515651 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 07:23:04 crc kubenswrapper[4749]: I0320 07:23:04.515733 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd"
Mar 20 07:23:04 crc kubenswrapper[4749]: I0320 07:23:04.516721 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2b2a2005813627d86d64d4f38a80b5429cc2b61afa4394e9316e568f5d812e7d"} pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 07:23:04 crc kubenswrapper[4749]: I0320 07:23:04.516840 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerName="machine-config-daemon" containerID="cri-o://2b2a2005813627d86d64d4f38a80b5429cc2b61afa4394e9316e568f5d812e7d" gracePeriod=600
Mar 20 07:23:05 crc kubenswrapper[4749]: I0320 07:23:05.072575 4749 generic.go:334] "Generic (PLEG): container finished" podID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerID="2b2a2005813627d86d64d4f38a80b5429cc2b61afa4394e9316e568f5d812e7d" exitCode=0
Mar 20 07:23:05 crc kubenswrapper[4749]: I0320 07:23:05.072657 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" event={"ID":"12151228-1cb9-4086-9a62-f4a9583f5f69","Type":"ContainerDied","Data":"2b2a2005813627d86d64d4f38a80b5429cc2b61afa4394e9316e568f5d812e7d"}
Mar 20 07:23:05 crc kubenswrapper[4749]: I0320 07:23:05.073052 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" event={"ID":"12151228-1cb9-4086-9a62-f4a9583f5f69","Type":"ContainerStarted","Data":"5e762cc7631bdd3af893f7f6b529361ab634432f285e3c9f01638e40b5f29d64"}
Mar 20 07:23:05 crc kubenswrapper[4749]: I0320 07:23:05.073088 4749 scope.go:117] "RemoveContainer" containerID="4c9937006b944b57a7ace3d87b4c4a8a6a9f78e9d693469869b65f6df516a69c"
Mar 20 07:23:52 crc kubenswrapper[4749]: I0320 07:23:52.640719 4749 scope.go:117] "RemoveContainer" containerID="5f79f11fe5f1911b3210b36c9a630f224a7c92db0f2ba3a961bdb7d93f736d32"
Mar 20 07:23:52 crc kubenswrapper[4749]: I0320 07:23:52.669035 4749 scope.go:117] "RemoveContainer" containerID="dc31afc519505de07be42767bdf62bf2b5ca3a71c120b605dc393154acc3985b"
Mar 20 07:23:52 crc kubenswrapper[4749]: I0320 07:23:52.722010 4749 scope.go:117] "RemoveContainer" containerID="96da0f68eb00a3a5cc59f119b7f6dff435df8cc1ef1ec0e0d73cac4bbb0ad58c"
Mar 20 07:24:00 crc kubenswrapper[4749]: I0320 07:24:00.146667 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566524-csb8r"]
Mar 20 07:24:00 crc kubenswrapper[4749]: E0320 07:24:00.147419 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80a300a4-78f9-4407-af0b-60e66f310b87" containerName="oc"
Mar 20 07:24:00 crc kubenswrapper[4749]: I0320 07:24:00.147434 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="80a300a4-78f9-4407-af0b-60e66f310b87" containerName="oc"
Mar 20 07:24:00 crc kubenswrapper[4749]: I0320 07:24:00.147706 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="80a300a4-78f9-4407-af0b-60e66f310b87" containerName="oc"
Mar 20 07:24:00 crc kubenswrapper[4749]: I0320 07:24:00.148124 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566524-csb8r"
Mar 20 07:24:00 crc kubenswrapper[4749]: I0320 07:24:00.151104 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 07:24:00 crc kubenswrapper[4749]: I0320 07:24:00.151266 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6rhdf"
Mar 20 07:24:00 crc kubenswrapper[4749]: I0320 07:24:00.151355 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 07:24:00 crc kubenswrapper[4749]: I0320 07:24:00.162841 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566524-csb8r"]
Mar 20 07:24:00 crc kubenswrapper[4749]: I0320 07:24:00.300670 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fckh\" (UniqueName: \"kubernetes.io/projected/c6d3a1eb-794e-4c9c-988b-9aef650c37b0-kube-api-access-7fckh\") pod \"auto-csr-approver-29566524-csb8r\" (UID: \"c6d3a1eb-794e-4c9c-988b-9aef650c37b0\") " pod="openshift-infra/auto-csr-approver-29566524-csb8r"
Mar 20 07:24:00 crc kubenswrapper[4749]: I0320 07:24:00.403017 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fckh\" (UniqueName: \"kubernetes.io/projected/c6d3a1eb-794e-4c9c-988b-9aef650c37b0-kube-api-access-7fckh\") pod \"auto-csr-approver-29566524-csb8r\" (UID: \"c6d3a1eb-794e-4c9c-988b-9aef650c37b0\") " pod="openshift-infra/auto-csr-approver-29566524-csb8r"
Mar 20 07:24:00 crc kubenswrapper[4749]: I0320 07:24:00.438675 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fckh\" (UniqueName: \"kubernetes.io/projected/c6d3a1eb-794e-4c9c-988b-9aef650c37b0-kube-api-access-7fckh\") pod \"auto-csr-approver-29566524-csb8r\" (UID: \"c6d3a1eb-794e-4c9c-988b-9aef650c37b0\") " pod="openshift-infra/auto-csr-approver-29566524-csb8r"
Mar 20 07:24:00 crc kubenswrapper[4749]: I0320 07:24:00.473616 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566524-csb8r"
Mar 20 07:24:00 crc kubenswrapper[4749]: I0320 07:24:00.741710 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566524-csb8r"]
Mar 20 07:24:00 crc kubenswrapper[4749]: W0320 07:24:00.752888 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc6d3a1eb_794e_4c9c_988b_9aef650c37b0.slice/crio-92616c7c06d3bea4621907609e1f010166109b3b5a4f65448350125cd0d08f40 WatchSource:0}: Error finding container 92616c7c06d3bea4621907609e1f010166109b3b5a4f65448350125cd0d08f40: Status 404 returned error can't find the container with id 92616c7c06d3bea4621907609e1f010166109b3b5a4f65448350125cd0d08f40
Mar 20 07:24:01 crc kubenswrapper[4749]: I0320 07:24:01.463908 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566524-csb8r" event={"ID":"c6d3a1eb-794e-4c9c-988b-9aef650c37b0","Type":"ContainerStarted","Data":"92616c7c06d3bea4621907609e1f010166109b3b5a4f65448350125cd0d08f40"}
Mar 20 07:24:02 crc kubenswrapper[4749]: I0320 07:24:02.475789 4749 generic.go:334] "Generic (PLEG): container finished" podID="c6d3a1eb-794e-4c9c-988b-9aef650c37b0" containerID="e41808d9265ea41ff96884562bc18f2b7f0078f95e505a187b00de0ecda610df" exitCode=0
Mar 20 07:24:02 crc kubenswrapper[4749]: I0320 07:24:02.475849 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566524-csb8r" event={"ID":"c6d3a1eb-794e-4c9c-988b-9aef650c37b0","Type":"ContainerDied","Data":"e41808d9265ea41ff96884562bc18f2b7f0078f95e505a187b00de0ecda610df"}
Mar 20 07:24:03 crc kubenswrapper[4749]: I0320 07:24:03.780572 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566524-csb8r"
Mar 20 07:24:03 crc kubenswrapper[4749]: I0320 07:24:03.950627 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fckh\" (UniqueName: \"kubernetes.io/projected/c6d3a1eb-794e-4c9c-988b-9aef650c37b0-kube-api-access-7fckh\") pod \"c6d3a1eb-794e-4c9c-988b-9aef650c37b0\" (UID: \"c6d3a1eb-794e-4c9c-988b-9aef650c37b0\") "
Mar 20 07:24:03 crc kubenswrapper[4749]: I0320 07:24:03.961621 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6d3a1eb-794e-4c9c-988b-9aef650c37b0-kube-api-access-7fckh" (OuterVolumeSpecName: "kube-api-access-7fckh") pod "c6d3a1eb-794e-4c9c-988b-9aef650c37b0" (UID: "c6d3a1eb-794e-4c9c-988b-9aef650c37b0"). InnerVolumeSpecName "kube-api-access-7fckh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 07:24:04 crc kubenswrapper[4749]: I0320 07:24:04.051895 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fckh\" (UniqueName: \"kubernetes.io/projected/c6d3a1eb-794e-4c9c-988b-9aef650c37b0-kube-api-access-7fckh\") on node \"crc\" DevicePath \"\""
Mar 20 07:24:04 crc kubenswrapper[4749]: I0320 07:24:04.489828 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566524-csb8r" event={"ID":"c6d3a1eb-794e-4c9c-988b-9aef650c37b0","Type":"ContainerDied","Data":"92616c7c06d3bea4621907609e1f010166109b3b5a4f65448350125cd0d08f40"}
Mar 20 07:24:04 crc kubenswrapper[4749]: I0320 07:24:04.489868 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92616c7c06d3bea4621907609e1f010166109b3b5a4f65448350125cd0d08f40"
Mar 20 07:24:04 crc kubenswrapper[4749]: I0320 07:24:04.489964 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566524-csb8r"
Mar 20 07:24:04 crc kubenswrapper[4749]: I0320 07:24:04.859055 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566518-llhfx"]
Mar 20 07:24:04 crc kubenswrapper[4749]: I0320 07:24:04.868269 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566518-llhfx"]
Mar 20 07:24:06 crc kubenswrapper[4749]: I0320 07:24:06.190556 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fef1b07-a814-496c-913e-301e76688b96" path="/var/lib/kubelet/pods/0fef1b07-a814-496c-913e-301e76688b96/volumes"
Mar 20 07:24:52 crc kubenswrapper[4749]: I0320 07:24:52.799475 4749 scope.go:117] "RemoveContainer" containerID="8d8d5ff9976d99ae7241f0b9fbb00a5a21a77065bbbc9798582616722f36caf2"
Mar 20 07:25:04 crc kubenswrapper[4749]: I0320 07:25:04.514874 4749 patch_prober.go:28] interesting pod/machine-config-daemon-fxqfd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 07:25:04 crc kubenswrapper[4749]: I0320 07:25:04.515709 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 07:25:34 crc kubenswrapper[4749]: I0320 07:25:34.514957 4749 patch_prober.go:28] interesting pod/machine-config-daemon-fxqfd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 07:25:34 crc kubenswrapper[4749]: I0320 07:25:34.515563 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 07:26:00 crc kubenswrapper[4749]: I0320 07:26:00.147680 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566526-62x7p"]
Mar 20 07:26:00 crc kubenswrapper[4749]: E0320 07:26:00.148702 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6d3a1eb-794e-4c9c-988b-9aef650c37b0" containerName="oc"
Mar 20 07:26:00 crc kubenswrapper[4749]: I0320 07:26:00.148730 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6d3a1eb-794e-4c9c-988b-9aef650c37b0" containerName="oc"
Mar 20 07:26:00 crc kubenswrapper[4749]: I0320 07:26:00.148969 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6d3a1eb-794e-4c9c-988b-9aef650c37b0" containerName="oc"
Mar 20 07:26:00 crc kubenswrapper[4749]: I0320 07:26:00.149817 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566526-62x7p"
Mar 20 07:26:00 crc kubenswrapper[4749]: I0320 07:26:00.152941 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 07:26:00 crc kubenswrapper[4749]: I0320 07:26:00.153126 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 07:26:00 crc kubenswrapper[4749]: I0320 07:26:00.153588 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6rhdf"
Mar 20 07:26:00 crc kubenswrapper[4749]: I0320 07:26:00.164829 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566526-62x7p"]
Mar 20 07:26:00 crc kubenswrapper[4749]: I0320 07:26:00.318618 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ngvp\" (UniqueName: \"kubernetes.io/projected/fc55663c-b6fa-419c-a9a3-2f4234b8f27d-kube-api-access-8ngvp\") pod \"auto-csr-approver-29566526-62x7p\" (UID: \"fc55663c-b6fa-419c-a9a3-2f4234b8f27d\") " pod="openshift-infra/auto-csr-approver-29566526-62x7p"
Mar 20 07:26:00 crc kubenswrapper[4749]: I0320 07:26:00.423059 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ngvp\" (UniqueName: \"kubernetes.io/projected/fc55663c-b6fa-419c-a9a3-2f4234b8f27d-kube-api-access-8ngvp\") pod \"auto-csr-approver-29566526-62x7p\" (UID: \"fc55663c-b6fa-419c-a9a3-2f4234b8f27d\") " pod="openshift-infra/auto-csr-approver-29566526-62x7p"
Mar 20 07:26:00 crc kubenswrapper[4749]: I0320 07:26:00.456725 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ngvp\" (UniqueName: \"kubernetes.io/projected/fc55663c-b6fa-419c-a9a3-2f4234b8f27d-kube-api-access-8ngvp\") pod \"auto-csr-approver-29566526-62x7p\" (UID: \"fc55663c-b6fa-419c-a9a3-2f4234b8f27d\") " pod="openshift-infra/auto-csr-approver-29566526-62x7p"
Mar 20 07:26:00 crc kubenswrapper[4749]: I0320 07:26:00.497559 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566526-62x7p"
Mar 20 07:26:00 crc kubenswrapper[4749]: I0320 07:26:00.778481 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566526-62x7p"]
Mar 20 07:26:01 crc kubenswrapper[4749]: I0320 07:26:01.301976 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566526-62x7p" event={"ID":"fc55663c-b6fa-419c-a9a3-2f4234b8f27d","Type":"ContainerStarted","Data":"232fd8e3768b721b5ea4069a76ac338b7553af3332297b0bb3161a26a1ee4bb5"}
Mar 20 07:26:01 crc kubenswrapper[4749]: I0320 07:26:01.443197 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-9pjlm"]
Mar 20 07:26:01 crc kubenswrapper[4749]: I0320 07:26:01.444751 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-9pjlm"
Mar 20 07:26:01 crc kubenswrapper[4749]: I0320 07:26:01.450219 4749 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-hvnxk"
Mar 20 07:26:01 crc kubenswrapper[4749]: I0320 07:26:01.450521 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Mar 20 07:26:01 crc kubenswrapper[4749]: I0320 07:26:01.450851 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Mar 20 07:26:01 crc kubenswrapper[4749]: I0320 07:26:01.454958 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-9pjlm"]
Mar 20 07:26:01 crc kubenswrapper[4749]: I0320 07:26:01.460851 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-p7gg4"]
Mar 20 07:26:01 crc kubenswrapper[4749]: I0320 07:26:01.461737 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-p7gg4"
Mar 20 07:26:01 crc kubenswrapper[4749]: I0320 07:26:01.464366 4749 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-f2dwz"
Mar 20 07:26:01 crc kubenswrapper[4749]: I0320 07:26:01.488478 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-p7gg4"]
Mar 20 07:26:01 crc kubenswrapper[4749]: I0320 07:26:01.501277 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-grns2"]
Mar 20 07:26:01 crc kubenswrapper[4749]: I0320 07:26:01.502100 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-grns2"
Mar 20 07:26:01 crc kubenswrapper[4749]: I0320 07:26:01.503677 4749 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-9jmx2"
Mar 20 07:26:01 crc kubenswrapper[4749]: I0320 07:26:01.505967 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-grns2"]
Mar 20 07:26:01 crc kubenswrapper[4749]: I0320 07:26:01.538859 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvh26\" (UniqueName: \"kubernetes.io/projected/d10c7079-87d2-41c4-acda-82bc9d8365d2-kube-api-access-dvh26\") pod \"cert-manager-cainjector-cf98fcc89-9pjlm\" (UID: \"d10c7079-87d2-41c4-acda-82bc9d8365d2\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-9pjlm"
Mar 20 07:26:01 crc kubenswrapper[4749]: I0320 07:26:01.538911 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6hn2\" (UniqueName: \"kubernetes.io/projected/8be44205-7bc6-4802-addc-996357e9ffd0-kube-api-access-j6hn2\") pod \"cert-manager-webhook-687f57d79b-grns2\" (UID: \"8be44205-7bc6-4802-addc-996357e9ffd0\") " pod="cert-manager/cert-manager-webhook-687f57d79b-grns2"
Mar 20 07:26:01 crc kubenswrapper[4749]: I0320 07:26:01.539028 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76495\" (UniqueName: \"kubernetes.io/projected/41f8fcf3-85bc-4ff0-926c-857f426fa501-kube-api-access-76495\") pod \"cert-manager-858654f9db-p7gg4\" (UID: \"41f8fcf3-85bc-4ff0-926c-857f426fa501\") " pod="cert-manager/cert-manager-858654f9db-p7gg4"
Mar 20 07:26:01 crc kubenswrapper[4749]: I0320 07:26:01.639808 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvh26\" (UniqueName: \"kubernetes.io/projected/d10c7079-87d2-41c4-acda-82bc9d8365d2-kube-api-access-dvh26\") pod \"cert-manager-cainjector-cf98fcc89-9pjlm\" (UID: \"d10c7079-87d2-41c4-acda-82bc9d8365d2\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-9pjlm"
Mar 20 07:26:01 crc kubenswrapper[4749]: I0320 07:26:01.639852 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6hn2\" (UniqueName: \"kubernetes.io/projected/8be44205-7bc6-4802-addc-996357e9ffd0-kube-api-access-j6hn2\") pod \"cert-manager-webhook-687f57d79b-grns2\" (UID: \"8be44205-7bc6-4802-addc-996357e9ffd0\") " pod="cert-manager/cert-manager-webhook-687f57d79b-grns2"
Mar 20 07:26:01 crc kubenswrapper[4749]: I0320 07:26:01.639904 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76495\" (UniqueName: \"kubernetes.io/projected/41f8fcf3-85bc-4ff0-926c-857f426fa501-kube-api-access-76495\") pod \"cert-manager-858654f9db-p7gg4\" (UID: \"41f8fcf3-85bc-4ff0-926c-857f426fa501\") " pod="cert-manager/cert-manager-858654f9db-p7gg4"
Mar 20 07:26:01 crc kubenswrapper[4749]: I0320 07:26:01.660781 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvh26\" (UniqueName: \"kubernetes.io/projected/d10c7079-87d2-41c4-acda-82bc9d8365d2-kube-api-access-dvh26\") pod \"cert-manager-cainjector-cf98fcc89-9pjlm\" (UID: \"d10c7079-87d2-41c4-acda-82bc9d8365d2\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-9pjlm"
Mar 20 07:26:01 crc kubenswrapper[4749]: I0320 07:26:01.661900 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76495\" (UniqueName: \"kubernetes.io/projected/41f8fcf3-85bc-4ff0-926c-857f426fa501-kube-api-access-76495\") pod \"cert-manager-858654f9db-p7gg4\" (UID: \"41f8fcf3-85bc-4ff0-926c-857f426fa501\") " pod="cert-manager/cert-manager-858654f9db-p7gg4"
Mar 20 07:26:01 crc kubenswrapper[4749]: I0320 07:26:01.668364 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6hn2\" (UniqueName: \"kubernetes.io/projected/8be44205-7bc6-4802-addc-996357e9ffd0-kube-api-access-j6hn2\") pod \"cert-manager-webhook-687f57d79b-grns2\" (UID: \"8be44205-7bc6-4802-addc-996357e9ffd0\") " pod="cert-manager/cert-manager-webhook-687f57d79b-grns2"
Mar 20 07:26:01 crc kubenswrapper[4749]: I0320 07:26:01.771463 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-9pjlm"
Mar 20 07:26:01 crc kubenswrapper[4749]: I0320 07:26:01.780826 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-p7gg4"
Mar 20 07:26:01 crc kubenswrapper[4749]: I0320 07:26:01.818063 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-grns2"
Mar 20 07:26:02 crc kubenswrapper[4749]: I0320 07:26:02.024090 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-9pjlm"]
Mar 20 07:26:02 crc kubenswrapper[4749]: I0320 07:26:02.311988 4749 generic.go:334] "Generic (PLEG): container finished" podID="fc55663c-b6fa-419c-a9a3-2f4234b8f27d" containerID="fcaf7400616e181e809747406d4a18533122f17f08dbf981631d0288f4bec979" exitCode=0
Mar 20 07:26:02 crc kubenswrapper[4749]: I0320 07:26:02.312114 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566526-62x7p" event={"ID":"fc55663c-b6fa-419c-a9a3-2f4234b8f27d","Type":"ContainerDied","Data":"fcaf7400616e181e809747406d4a18533122f17f08dbf981631d0288f4bec979"}
Mar 20 07:26:02 crc kubenswrapper[4749]: I0320 07:26:02.319774 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-9pjlm" event={"ID":"d10c7079-87d2-41c4-acda-82bc9d8365d2","Type":"ContainerStarted","Data":"a43dff323fd7ab6dd07421aa732569dd7320368521b23d1b63c6ac7f117f92cf"}
Mar 20 07:26:02 crc kubenswrapper[4749]: I0320 07:26:02.323130 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-grns2"]
Mar 20 07:26:02 crc kubenswrapper[4749]: I0320 07:26:02.345883 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-p7gg4"]
Mar 20 07:26:03 crc kubenswrapper[4749]: I0320 07:26:03.328214 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-grns2" event={"ID":"8be44205-7bc6-4802-addc-996357e9ffd0","Type":"ContainerStarted","Data":"cd3a1762763ce5cb03db9629b904ed0ac7e62f5caf69526a82946cfce3eb60aa"}
Mar 20 07:26:03 crc kubenswrapper[4749]: I0320 07:26:03.330810 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-p7gg4" event={"ID":"41f8fcf3-85bc-4ff0-926c-857f426fa501","Type":"ContainerStarted","Data":"5df875024b41261b1d9613b480d0672de6d0c749152ff913abd556752d541dbb"}
Mar 20 07:26:03 crc kubenswrapper[4749]: I0320 07:26:03.667143 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566526-62x7p"
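The "SyncLoop (PLEG): event for pod" records carry a pod lifecycle event whose payload prints as JSON: the pod UID, an event type such as ContainerStarted or ContainerDied, and the container (or sandbox) ID in Data. A small sketch that decodes one such payload captured from the records above; the struct is illustrative, built from the three field names visible in the log, not kubelet's own type.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// plegEvent mirrors the payload printed after "event=" in the
// "SyncLoop (PLEG): event for pod" records above.
type plegEvent struct {
	ID   string // pod UID, e.g. "fc55663c-b6fa-419c-a9a3-2f4234b8f27d"
	Type string // "ContainerStarted", "ContainerDied", ...
	Data string // container or sandbox ID the event refers to
}

func main() {
	payload := `{"ID":"fc55663c-b6fa-419c-a9a3-2f4234b8f27d","Type":"ContainerStarted","Data":"232fd8e3768b721b5ea4069a76ac338b7553af3332297b0bb3161a26a1ee4bb5"}`
	var ev plegEvent
	if err := json.Unmarshal([]byte(payload), &ev); err != nil {
		panic(err)
	}
	fmt.Println(ev.Type, "->", ev.Data[:12]) // print the short container ID prefix
}
```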
Mar 20 07:26:03 crc kubenswrapper[4749]: I0320 07:26:03.770481 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ngvp\" (UniqueName: \"kubernetes.io/projected/fc55663c-b6fa-419c-a9a3-2f4234b8f27d-kube-api-access-8ngvp\") pod \"fc55663c-b6fa-419c-a9a3-2f4234b8f27d\" (UID: \"fc55663c-b6fa-419c-a9a3-2f4234b8f27d\") "
Mar 20 07:26:03 crc kubenswrapper[4749]: I0320 07:26:03.781472 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc55663c-b6fa-419c-a9a3-2f4234b8f27d-kube-api-access-8ngvp" (OuterVolumeSpecName: "kube-api-access-8ngvp") pod "fc55663c-b6fa-419c-a9a3-2f4234b8f27d" (UID: "fc55663c-b6fa-419c-a9a3-2f4234b8f27d"). InnerVolumeSpecName "kube-api-access-8ngvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 07:26:03 crc kubenswrapper[4749]: I0320 07:26:03.872147 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ngvp\" (UniqueName: \"kubernetes.io/projected/fc55663c-b6fa-419c-a9a3-2f4234b8f27d-kube-api-access-8ngvp\") on node \"crc\" DevicePath \"\""
Mar 20 07:26:04 crc kubenswrapper[4749]: I0320 07:26:04.340622 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566526-62x7p" event={"ID":"fc55663c-b6fa-419c-a9a3-2f4234b8f27d","Type":"ContainerDied","Data":"232fd8e3768b721b5ea4069a76ac338b7553af3332297b0bb3161a26a1ee4bb5"}
Mar 20 07:26:04 crc kubenswrapper[4749]: I0320 07:26:04.340665 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="232fd8e3768b721b5ea4069a76ac338b7553af3332297b0bb3161a26a1ee4bb5"
Mar 20 07:26:04 crc kubenswrapper[4749]: I0320 07:26:04.340668 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566526-62x7p"
Mar 20 07:26:04 crc kubenswrapper[4749]: I0320 07:26:04.514681 4749 patch_prober.go:28] interesting pod/machine-config-daemon-fxqfd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 07:26:04 crc kubenswrapper[4749]: I0320 07:26:04.514934 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 07:26:04 crc kubenswrapper[4749]: I0320 07:26:04.514974 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd"
Mar 20 07:26:04 crc kubenswrapper[4749]: I0320 07:26:04.515514 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5e762cc7631bdd3af893f7f6b529361ab634432f285e3c9f01638e40b5f29d64"} pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 07:26:04 crc kubenswrapper[4749]: I0320 07:26:04.515588 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerName="machine-config-daemon" containerID="cri-o://5e762cc7631bdd3af893f7f6b529361ab634432f285e3c9f01638e40b5f29d64" gracePeriod=600
Mar 20 07:26:04 crc kubenswrapper[4749]: E0320 07:26:04.621093 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12151228_1cb9_4086_9a62_f4a9583f5f69.slice/crio-5e762cc7631bdd3af893f7f6b529361ab634432f285e3c9f01638e40b5f29d64.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12151228_1cb9_4086_9a62_f4a9583f5f69.slice/crio-conmon-5e762cc7631bdd3af893f7f6b529361ab634432f285e3c9f01638e40b5f29d64.scope\": RecentStats: unable to find data in memory cache]"
Mar 20 07:26:04 crc kubenswrapper[4749]: I0320 07:26:04.724389 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566520-j7gn9"]
Mar 20 07:26:04 crc kubenswrapper[4749]: I0320 07:26:04.727421 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566520-j7gn9"]
Mar 20 07:26:05 crc kubenswrapper[4749]: I0320 07:26:05.350872 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-p7gg4" event={"ID":"41f8fcf3-85bc-4ff0-926c-857f426fa501","Type":"ContainerStarted","Data":"23c63c5d004159ec18825b869a25357d635d28541388561153c3e8454f9cfd8c"}
Mar 20 07:26:05 crc kubenswrapper[4749]: I0320 07:26:05.357646 4749 generic.go:334] "Generic (PLEG): container finished" podID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerID="5e762cc7631bdd3af893f7f6b529361ab634432f285e3c9f01638e40b5f29d64" exitCode=0
Mar 20 07:26:05 crc kubenswrapper[4749]: I0320 07:26:05.357751 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" event={"ID":"12151228-1cb9-4086-9a62-f4a9583f5f69","Type":"ContainerDied","Data":"5e762cc7631bdd3af893f7f6b529361ab634432f285e3c9f01638e40b5f29d64"}
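The probe failure above is an HTTP GET liveness check against 127.0.0.1:8798/health; once it fails, the kubelet kills the container with the pod's termination grace period (600s here) and restarts it. A standalone sketch of the same check; the 1s timeout is an assumed value, since the actual probe timeout comes from the pod spec, which this log does not show.

```go
package main

import (
	"fmt"
	"net/http"
	"time"
)

// probe performs the same kind of check the kubelet logged above: an
// HTTP GET against the machine-config-daemon health endpoint. Any
// transport error (e.g. "connect: connection refused") or a status
// outside 2xx/3xx counts as a probe failure.
func probe(url string, timeout time.Duration) error {
	client := &http.Client{Timeout: timeout}
	resp, err := client.Get(url)
	if err != nil {
		return err
	}
	defer resp.Body.Close()
	if resp.StatusCode < 200 || resp.StatusCode >= 400 {
		return fmt.Errorf("unexpected status %d", resp.StatusCode)
	}
	return nil
}

func main() {
	// Timeout is an assumption; the real value lives in the pod spec.
	if err := probe("http://127.0.0.1:8798/health", time.Second); err != nil {
		fmt.Println("Probe failed:", err) // analogous to the prober.go:107 record
	}
}
```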
Mar 20 07:26:05 crc kubenswrapper[4749]: I0320 07:26:05.357811 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" event={"ID":"12151228-1cb9-4086-9a62-f4a9583f5f69","Type":"ContainerStarted","Data":"3c74897c54ef7454cef1084b8e06312bda867ecfea849b2a4ba3d53fa61618a4"}
Mar 20 07:26:05 crc kubenswrapper[4749]: I0320 07:26:05.357838 4749 scope.go:117] "RemoveContainer" containerID="2b2a2005813627d86d64d4f38a80b5429cc2b61afa4394e9316e568f5d812e7d"
Mar 20 07:26:05 crc kubenswrapper[4749]: I0320 07:26:05.363376 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-9pjlm" event={"ID":"d10c7079-87d2-41c4-acda-82bc9d8365d2","Type":"ContainerStarted","Data":"b548b6ca0722f2bcba2240e5f73de3b946efd3219c7f85c79958b8e0c52e115f"}
Mar 20 07:26:05 crc kubenswrapper[4749]: I0320 07:26:05.370624 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-p7gg4" podStartSLOduration=1.544899711 podStartE2EDuration="4.370600675s" podCreationTimestamp="2026-03-20 07:26:01 +0000 UTC" firstStartedPulling="2026-03-20 07:26:02.36180476 +0000 UTC m=+798.911462437" lastFinishedPulling="2026-03-20 07:26:05.187505754 +0000 UTC m=+801.737163401" observedRunningTime="2026-03-20 07:26:05.369439377 +0000 UTC m=+801.919097044" watchObservedRunningTime="2026-03-20 07:26:05.370600675 +0000 UTC m=+801.920258312"
Mar 20 07:26:06 crc kubenswrapper[4749]: I0320 07:26:06.189271 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e201ab3-7a56-4786-867f-5beef3df85b8" path="/var/lib/kubelet/pods/2e201ab3-7a56-4786-867f-5beef3df85b8/volumes"
Mar 20 07:26:07 crc kubenswrapper[4749]: I0320 07:26:07.381347 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-grns2" event={"ID":"8be44205-7bc6-4802-addc-996357e9ffd0","Type":"ContainerStarted","Data":"25e2eefbc5e0f5209bffcab5671bd01d51ceb960b723b790e9b648ea263fd97e"}
Mar 20 07:26:07 crc kubenswrapper[4749]: I0320 07:26:07.381797 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-grns2"
Mar 20 07:26:07 crc kubenswrapper[4749]: I0320 07:26:07.405912 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-grns2" podStartSLOduration=2.227141502 podStartE2EDuration="6.405886271s" podCreationTimestamp="2026-03-20 07:26:01 +0000 UTC" firstStartedPulling="2026-03-20 07:26:02.348911658 +0000 UTC m=+798.898569315" lastFinishedPulling="2026-03-20 07:26:06.527656407 +0000 UTC m=+803.077314084" observedRunningTime="2026-03-20 07:26:07.402969541 +0000 UTC m=+803.952627228" watchObservedRunningTime="2026-03-20 07:26:07.405886271 +0000 UTC m=+803.955543948"
Mar 20 07:26:07 crc kubenswrapper[4749]: I0320 07:26:07.408688 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-9pjlm" podStartSLOduration=3.9550419789999998 podStartE2EDuration="6.408674179s" podCreationTimestamp="2026-03-20 07:26:01 +0000 UTC" firstStartedPulling="2026-03-20 07:26:02.034508949 +0000 UTC m=+798.584166596" lastFinishedPulling="2026-03-20 07:26:04.488141149 +0000 UTC m=+801.037798796" observedRunningTime="2026-03-20 07:26:05.406204956 +0000 UTC m=+801.955862623" watchObservedRunningTime="2026-03-20 07:26:07.408674179 +0000 UTC m=+803.958331856"
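The "Observed pod startup duration" records are internally consistent: for cert-manager-858654f9db-p7gg4, podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration further subtracts the image-pull window measured on the monotonic clock (the m=+... offsets). A short sketch reproducing that arithmetic from the numbers in the record; the decomposition is inferred from these values, not quoted from kubelet source.

```go
package main

import "fmt"

func main() {
	const (
		creation      = 0.0           // 07:26:01 +0000 UTC, taken as the origin
		watchObserved = 4.370600675   // 07:26:05.370600675 minus creation
		pullStartMono = 798.911462437 // firstStartedPulling m=+...
		pullEndMono   = 801.737163401 // lastFinishedPulling m=+...
	)
	e2e := watchObserved - creation
	slo := e2e - (pullEndMono - pullStartMono)
	fmt.Printf("podStartE2EDuration=%.9fs podStartSLOduration=%.9f\n", e2e, slo)
	// Prints ~4.370600675s and ~1.544899711, matching the record
	// (up to float64 rounding in the last digit).
}
```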
Mar 20 07:26:11 crc kubenswrapper[4749]: I0320 07:26:11.619631 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-tdgcw"]
Mar 20 07:26:11 crc kubenswrapper[4749]: I0320 07:26:11.620712 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerName="ovn-controller" containerID="cri-o://e4f35833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2" gracePeriod=30
Mar 20 07:26:11 crc kubenswrapper[4749]: I0320 07:26:11.620854 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerName="kube-rbac-proxy-node" containerID="cri-o://e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd" gracePeriod=30
Mar 20 07:26:11 crc kubenswrapper[4749]: I0320 07:26:11.620816 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerName="northd" containerID="cri-o://f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be" gracePeriod=30
Mar 20 07:26:11 crc kubenswrapper[4749]: I0320 07:26:11.620960 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e" gracePeriod=30
Mar 20 07:26:11 crc kubenswrapper[4749]: I0320 07:26:11.621011 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerName="nbdb" containerID="cri-o://390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e" gracePeriod=30
Mar 20 07:26:11 crc kubenswrapper[4749]: I0320 07:26:11.620820 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerName="sbdb" containerID="cri-o://8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea" gracePeriod=30
Mar 20 07:26:11 crc kubenswrapper[4749]: I0320 07:26:11.620885 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerName="ovn-acl-logging" containerID="cri-o://adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189" gracePeriod=30
Mar 20 07:26:11 crc kubenswrapper[4749]: I0320 07:26:11.675253 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerName="ovnkube-controller" containerID="cri-o://42d0d0f46d701b86a49a246c4107b89cf1500d319a478ae5e89dcd611706ce84" gracePeriod=30
Mar 20 07:26:11 crc kubenswrapper[4749]: I0320 07:26:11.822884 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-grns2"
Mar 20 07:26:11 crc kubenswrapper[4749]: I0320 07:26:11.963763 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tdgcw_2153d97b-a108-49f8-b6c8-8223ea65b878/ovnkube-controller/3.log"
Mar 20 07:26:11 crc kubenswrapper[4749]: I0320 07:26:11.967518 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tdgcw_2153d97b-a108-49f8-b6c8-8223ea65b878/ovn-acl-logging/0.log"
Mar 20 07:26:11 crc kubenswrapper[4749]: I0320 07:26:11.968203 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tdgcw_2153d97b-a108-49f8-b6c8-8223ea65b878/ovn-controller/0.log"
Mar 20 07:26:11 crc kubenswrapper[4749]: I0320 07:26:11.968951 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw"
Mar 20 07:26:11 crc kubenswrapper[4749]: I0320 07:26:11.999509 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2153d97b-a108-49f8-b6c8-8223ea65b878-ovnkube-script-lib\") pod \"2153d97b-a108-49f8-b6c8-8223ea65b878\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") "
Mar 20 07:26:11 crc kubenswrapper[4749]: I0320 07:26:11.999587 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-run-systemd\") pod \"2153d97b-a108-49f8-b6c8-8223ea65b878\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") "
Mar 20 07:26:11 crc kubenswrapper[4749]: I0320 07:26:11.999632 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-host-cni-bin\") pod \"2153d97b-a108-49f8-b6c8-8223ea65b878\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") "
Mar 20 07:26:11 crc kubenswrapper[4749]: I0320 07:26:11.999695 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2153d97b-a108-49f8-b6c8-8223ea65b878-ovnkube-config\") pod \"2153d97b-a108-49f8-b6c8-8223ea65b878\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") "
Mar 20 07:26:11 crc kubenswrapper[4749]: I0320 07:26:11.999744 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2153d97b-a108-49f8-b6c8-8223ea65b878-ovn-node-metrics-cert\") pod \"2153d97b-a108-49f8-b6c8-8223ea65b878\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") "
Mar 20 07:26:11 crc kubenswrapper[4749]: I0320 07:26:11.999775 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "2153d97b-a108-49f8-b6c8-8223ea65b878" (UID: "2153d97b-a108-49f8-b6c8-8223ea65b878"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 07:26:11 crc kubenswrapper[4749]: I0320 07:26:11.999811 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-host-var-lib-cni-networks-ovn-kubernetes\") pod \"2153d97b-a108-49f8-b6c8-8223ea65b878\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") "
Mar 20 07:26:11 crc kubenswrapper[4749]: I0320 07:26:11.999861 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-run-ovn\") pod \"2153d97b-a108-49f8-b6c8-8223ea65b878\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") "
Mar 20 07:26:11 crc kubenswrapper[4749]: I0320 07:26:11.999897 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-host-kubelet\") pod \"2153d97b-a108-49f8-b6c8-8223ea65b878\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") "
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:11.999933 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-host-run-netns\") pod \"2153d97b-a108-49f8-b6c8-8223ea65b878\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") "
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:11.999984 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-node-log\") pod \"2153d97b-a108-49f8-b6c8-8223ea65b878\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") "
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.000017 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77vkc\" (UniqueName: \"kubernetes.io/projected/2153d97b-a108-49f8-b6c8-8223ea65b878-kube-api-access-77vkc\") pod \"2153d97b-a108-49f8-b6c8-8223ea65b878\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") "
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.000092 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-host-cni-netd\") pod \"2153d97b-a108-49f8-b6c8-8223ea65b878\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") "
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.000139 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-log-socket\") pod \"2153d97b-a108-49f8-b6c8-8223ea65b878\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") "
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.000193 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-systemd-units\") pod \"2153d97b-a108-49f8-b6c8-8223ea65b878\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") "
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.000267 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-host-slash\") pod \"2153d97b-a108-49f8-b6c8-8223ea65b878\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") "
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.000365 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-var-lib-openvswitch\") pod \"2153d97b-a108-49f8-b6c8-8223ea65b878\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") "
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.000423 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2153d97b-a108-49f8-b6c8-8223ea65b878-env-overrides\") pod \"2153d97b-a108-49f8-b6c8-8223ea65b878\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") "
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.000454 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-etc-openvswitch\") pod \"2153d97b-a108-49f8-b6c8-8223ea65b878\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") "
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.000534 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-host-run-ovn-kubernetes\") pod \"2153d97b-a108-49f8-b6c8-8223ea65b878\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") "
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.000562 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2153d97b-a108-49f8-b6c8-8223ea65b878-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "2153d97b-a108-49f8-b6c8-8223ea65b878" (UID: "2153d97b-a108-49f8-b6c8-8223ea65b878"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.000568 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-run-openvswitch\") pod \"2153d97b-a108-49f8-b6c8-8223ea65b878\" (UID: \"2153d97b-a108-49f8-b6c8-8223ea65b878\") "
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.000596 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "2153d97b-a108-49f8-b6c8-8223ea65b878" (UID: "2153d97b-a108-49f8-b6c8-8223ea65b878"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.000847 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2153d97b-a108-49f8-b6c8-8223ea65b878-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "2153d97b-a108-49f8-b6c8-8223ea65b878" (UID: "2153d97b-a108-49f8-b6c8-8223ea65b878"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.000969 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "2153d97b-a108-49f8-b6c8-8223ea65b878" (UID: "2153d97b-a108-49f8-b6c8-8223ea65b878"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.000998 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "2153d97b-a108-49f8-b6c8-8223ea65b878" (UID: "2153d97b-a108-49f8-b6c8-8223ea65b878"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.001021 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "2153d97b-a108-49f8-b6c8-8223ea65b878" (UID: "2153d97b-a108-49f8-b6c8-8223ea65b878"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.001051 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "2153d97b-a108-49f8-b6c8-8223ea65b878" (UID: "2153d97b-a108-49f8-b6c8-8223ea65b878"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.001039 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "2153d97b-a108-49f8-b6c8-8223ea65b878" (UID: "2153d97b-a108-49f8-b6c8-8223ea65b878"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.001071 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-log-socket" (OuterVolumeSpecName: "log-socket") pod "2153d97b-a108-49f8-b6c8-8223ea65b878" (UID: "2153d97b-a108-49f8-b6c8-8223ea65b878"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.001066 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-node-log" (OuterVolumeSpecName: "node-log") pod "2153d97b-a108-49f8-b6c8-8223ea65b878" (UID: "2153d97b-a108-49f8-b6c8-8223ea65b878"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.001105 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "2153d97b-a108-49f8-b6c8-8223ea65b878" (UID: "2153d97b-a108-49f8-b6c8-8223ea65b878"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.001117 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-host-slash" (OuterVolumeSpecName: "host-slash") pod "2153d97b-a108-49f8-b6c8-8223ea65b878" (UID: "2153d97b-a108-49f8-b6c8-8223ea65b878"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.001156 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "2153d97b-a108-49f8-b6c8-8223ea65b878" (UID: "2153d97b-a108-49f8-b6c8-8223ea65b878"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.001227 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "2153d97b-a108-49f8-b6c8-8223ea65b878" (UID: "2153d97b-a108-49f8-b6c8-8223ea65b878"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.001499 4749 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-etc-openvswitch\") on node \"crc\" DevicePath \"\""
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.001517 4749 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-run-openvswitch\") on node \"crc\" DevicePath \"\""
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.001527 4749 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2153d97b-a108-49f8-b6c8-8223ea65b878-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.001539 4749 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-host-cni-bin\") on node \"crc\" DevicePath \"\""
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.001551 4749 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2153d97b-a108-49f8-b6c8-8223ea65b878-ovnkube-config\") on node \"crc\" DevicePath \"\""
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.001565 4749 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-run-ovn\") on node \"crc\" DevicePath \"\""
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.001577 4749 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-host-run-netns\") on node \"crc\" DevicePath \"\""
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.001588 4749 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-host-kubelet\") on node \"crc\" DevicePath \"\""
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.001599 4749 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-host-cni-netd\") on node \"crc\" DevicePath \"\""
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.001607 4749 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-systemd-units\") on node \"crc\" DevicePath \"\""
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.001643 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2153d97b-a108-49f8-b6c8-8223ea65b878-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "2153d97b-a108-49f8-b6c8-8223ea65b878" (UID: "2153d97b-a108-49f8-b6c8-8223ea65b878"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.001710 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "2153d97b-a108-49f8-b6c8-8223ea65b878" (UID: "2153d97b-a108-49f8-b6c8-8223ea65b878"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.010216 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2153d97b-a108-49f8-b6c8-8223ea65b878-kube-api-access-77vkc" (OuterVolumeSpecName: "kube-api-access-77vkc") pod "2153d97b-a108-49f8-b6c8-8223ea65b878" (UID: "2153d97b-a108-49f8-b6c8-8223ea65b878"). InnerVolumeSpecName "kube-api-access-77vkc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.010308 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2153d97b-a108-49f8-b6c8-8223ea65b878-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "2153d97b-a108-49f8-b6c8-8223ea65b878" (UID: "2153d97b-a108-49f8-b6c8-8223ea65b878"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.038902 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "2153d97b-a108-49f8-b6c8-8223ea65b878" (UID: "2153d97b-a108-49f8-b6c8-8223ea65b878"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
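The teardown above walks every volume of the deleted pod through the same three steps the mount side used, in reverse: the reconciler starts the unmount (reconciler_common.go:159), the operation executor tears it down (operation_generator.go:803), and the volume is finally reported detached (reconciler_common.go:293). A compact sketch listing both sequences as they appear in this log; the step names are quoted from the records and the source locations come from the klog headers, while the Go types are purely illustrative.

```go
package main

import "fmt"

// Mount-side sequence, per volume, as seen for the new pods above.
var mountSteps = []string{
	"operationExecutor.VerifyControllerAttachedVolume started", // reconciler_common.go:245
	"operationExecutor.MountVolume started",                    // reconciler_common.go:218
	"MountVolume.SetUp succeeded",                              // operation_generator.go:637
}

// Unmount-side sequence, per volume, as seen during pod deletion.
var unmountSteps = []string{
	"operationExecutor.UnmountVolume started", // reconciler_common.go:159
	"UnmountVolume.TearDown succeeded",        // operation_generator.go:803
	"Volume detached",                         // reconciler_common.go:293
}

func main() {
	fmt.Println("mount:  ", mountSteps)
	fmt.Println("unmount:", unmountSteps)
}
```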
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.042493 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-n58rp"]
Mar 20 07:26:12 crc kubenswrapper[4749]: E0320 07:26:12.042863 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerName="northd"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.042898 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerName="northd"
Mar 20 07:26:12 crc kubenswrapper[4749]: E0320 07:26:12.042919 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerName="kubecfg-setup"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.042933 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerName="kubecfg-setup"
Mar 20 07:26:12 crc kubenswrapper[4749]: E0320 07:26:12.042956 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerName="ovn-controller"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.042971 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerName="ovn-controller"
Mar 20 07:26:12 crc kubenswrapper[4749]: E0320 07:26:12.042987 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerName="ovnkube-controller"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.043000 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerName="ovnkube-controller"
Mar 20 07:26:12 crc kubenswrapper[4749]: E0320 07:26:12.043021 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc55663c-b6fa-419c-a9a3-2f4234b8f27d" containerName="oc"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.043034 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc55663c-b6fa-419c-a9a3-2f4234b8f27d" containerName="oc"
Mar 20 07:26:12 crc kubenswrapper[4749]: E0320 07:26:12.043051 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerName="kube-rbac-proxy-ovn-metrics"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.043064 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerName="kube-rbac-proxy-ovn-metrics"
Mar 20 07:26:12 crc kubenswrapper[4749]: E0320 07:26:12.043082 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerName="nbdb"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.043095 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerName="nbdb"
Mar 20 07:26:12 crc kubenswrapper[4749]: E0320 07:26:12.043118 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerName="ovn-acl-logging"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.043131 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerName="ovn-acl-logging"
Mar 20 07:26:12 crc kubenswrapper[4749]: E0320 07:26:12.043146 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerName="ovnkube-controller"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.043159 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerName="ovnkube-controller"
Mar 20 07:26:12 crc kubenswrapper[4749]: E0320 07:26:12.043172 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerName="sbdb"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.043184 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerName="sbdb"
Mar 20 07:26:12 crc kubenswrapper[4749]: E0320 07:26:12.043210 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerName="kube-rbac-proxy-node"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.043224 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerName="kube-rbac-proxy-node"
Mar 20 07:26:12 crc kubenswrapper[4749]: E0320 07:26:12.043244 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerName="ovnkube-controller"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.043258 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerName="ovnkube-controller"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.043465 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerName="ovn-controller"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.043486 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerName="sbdb"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.043507 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerName="nbdb"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.043525 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerName="kube-rbac-proxy-ovn-metrics"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.043543 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc55663c-b6fa-419c-a9a3-2f4234b8f27d" containerName="oc"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.043560 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerName="kube-rbac-proxy-node"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.043582 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerName="northd"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.043600 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerName="ovnkube-controller"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.043615 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerName="ovn-acl-logging"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.043684 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerName="ovnkube-controller"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.043769 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerName="ovnkube-controller"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.043798 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerName="ovnkube-controller"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.043816 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerName="ovnkube-controller"
Mar 20 07:26:12 crc kubenswrapper[4749]: E0320 07:26:12.043993 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerName="ovnkube-controller"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.044019 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerName="ovnkube-controller"
Mar 20 07:26:12 crc kubenswrapper[4749]: E0320 07:26:12.044051 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerName="ovnkube-controller"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.044064 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerName="ovnkube-controller"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.047644 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-n58rp"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.102691 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6ca01ee6-13a5-4472-a302-f58e21d445d7-host-slash\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.102942 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6ca01ee6-13a5-4472-a302-f58e21d445d7-run-systemd\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.103067 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6ca01ee6-13a5-4472-a302-f58e21d445d7-var-lib-openvswitch\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.103162 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6ca01ee6-13a5-4472-a302-f58e21d445d7-host-run-netns\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.103252 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6ca01ee6-13a5-4472-a302-f58e21d445d7-systemd-units\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.103375 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ca01ee6-13a5-4472-a302-f58e21d445d7-env-overrides\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.103470 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6ca01ee6-13a5-4472-a302-f58e21d445d7-host-cni-bin\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.103577 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6ca01ee6-13a5-4472-a302-f58e21d445d7-run-openvswitch\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.103676 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6ca01ee6-13a5-4472-a302-f58e21d445d7-host-run-ovn-kubernetes\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.103882 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ca01ee6-13a5-4472-a302-f58e21d445d7-ovnkube-script-lib\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.103960 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6ca01ee6-13a5-4472-a302-f58e21d445d7-host-kubelet\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.104041 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6ca01ee6-13a5-4472-a302-f58e21d445d7-run-ovn\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.104095 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6ca01ee6-13a5-4472-a302-f58e21d445d7-etc-openvswitch\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.104138 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6ca01ee6-13a5-4472-a302-f58e21d445d7-node-log\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.104238 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ca01ee6-13a5-4472-a302-f58e21d445d7-ovnkube-config\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.104313 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6ca01ee6-13a5-4472-a302-f58e21d445d7-host-cni-netd\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.104361 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6ca01ee6-13a5-4472-a302-f58e21d445d7-log-socket\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.104394 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzlkw\" (UniqueName: \"kubernetes.io/projected/6ca01ee6-13a5-4472-a302-f58e21d445d7-kube-api-access-fzlkw\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.104453 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6ca01ee6-13a5-4472-a302-f58e21d445d7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.104491 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ca01ee6-13a5-4472-a302-f58e21d445d7-ovn-node-metrics-cert\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.104637 4749 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-log-socket\") on node \"crc\" DevicePath \"\""
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.104687 4749 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-host-slash\") on node \"crc\" DevicePath \"\""
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.104715 4749 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-var-lib-openvswitch\") on node \"crc\" DevicePath \"\""
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.104762 4749 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2153d97b-a108-49f8-b6c8-8223ea65b878-env-overrides\") on node \"crc\" DevicePath \"\""
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.104792 4749 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.104818 4749 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-run-systemd\") on node \"crc\" DevicePath \"\""
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.104843 4749 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2153d97b-a108-49f8-b6c8-8223ea65b878-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.104872 4749 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.104915 4749 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2153d97b-a108-49f8-b6c8-8223ea65b878-node-log\") on node \"crc\" DevicePath \"\""
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.104953 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77vkc\" (UniqueName: \"kubernetes.io/projected/2153d97b-a108-49f8-b6c8-8223ea65b878-kube-api-access-77vkc\") on node \"crc\" DevicePath \"\""
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.205867 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzlkw\" (UniqueName: \"kubernetes.io/projected/6ca01ee6-13a5-4472-a302-f58e21d445d7-kube-api-access-fzlkw\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.205928 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6ca01ee6-13a5-4472-a302-f58e21d445d7-log-socket\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.205953 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6ca01ee6-13a5-4472-a302-f58e21d445d7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.205973 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ca01ee6-13a5-4472-a302-f58e21d445d7-ovn-node-metrics-cert\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.205993 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6ca01ee6-13a5-4472-a302-f58e21d445d7-host-slash\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.206008 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6ca01ee6-13a5-4472-a302-f58e21d445d7-run-systemd\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.206023 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6ca01ee6-13a5-4472-a302-f58e21d445d7-var-lib-openvswitch\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.206026 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6ca01ee6-13a5-4472-a302-f58e21d445d7-log-socket\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.206046 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6ca01ee6-13a5-4472-a302-f58e21d445d7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.206074 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6ca01ee6-13a5-4472-a302-f58e21d445d7-host-run-netns\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.206042 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6ca01ee6-13a5-4472-a302-f58e21d445d7-host-run-netns\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.206095 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6ca01ee6-13a5-4472-a302-f58e21d445d7-run-systemd\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.206119 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6ca01ee6-13a5-4472-a302-f58e21d445d7-systemd-units\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.206138 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6ca01ee6-13a5-4472-a302-f58e21d445d7-systemd-units\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.206142 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ca01ee6-13a5-4472-a302-f58e21d445d7-env-overrides\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.206164 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6ca01ee6-13a5-4472-a302-f58e21d445d7-var-lib-openvswitch\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.206166 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6ca01ee6-13a5-4472-a302-f58e21d445d7-host-cni-bin\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.206183 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6ca01ee6-13a5-4472-a302-f58e21d445d7-host-cni-bin\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.206122 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6ca01ee6-13a5-4472-a302-f58e21d445d7-host-slash\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.206198 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6ca01ee6-13a5-4472-a302-f58e21d445d7-run-openvswitch\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.206225 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6ca01ee6-13a5-4472-a302-f58e21d445d7-host-run-ovn-kubernetes\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.206253 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ca01ee6-13a5-4472-a302-f58e21d445d7-ovnkube-script-lib\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp"
Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.206269 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6ca01ee6-13a5-4472-a302-f58e21d445d7-host-kubelet\") pod \"ovnkube-node-n58rp\"
(UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.206301 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6ca01ee6-13a5-4472-a302-f58e21d445d7-run-ovn\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.206269 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6ca01ee6-13a5-4472-a302-f58e21d445d7-run-openvswitch\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.206320 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6ca01ee6-13a5-4472-a302-f58e21d445d7-etc-openvswitch\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.206344 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6ca01ee6-13a5-4472-a302-f58e21d445d7-etc-openvswitch\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.206358 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6ca01ee6-13a5-4472-a302-f58e21d445d7-node-log\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.206429 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ca01ee6-13a5-4472-a302-f58e21d445d7-ovnkube-config\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.206459 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6ca01ee6-13a5-4472-a302-f58e21d445d7-host-cni-netd\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.206565 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6ca01ee6-13a5-4472-a302-f58e21d445d7-host-cni-netd\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.206600 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6ca01ee6-13a5-4472-a302-f58e21d445d7-node-log\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.206367 
4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6ca01ee6-13a5-4472-a302-f58e21d445d7-host-run-ovn-kubernetes\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.206857 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ca01ee6-13a5-4472-a302-f58e21d445d7-env-overrides\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.206911 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6ca01ee6-13a5-4472-a302-f58e21d445d7-host-kubelet\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.206942 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6ca01ee6-13a5-4472-a302-f58e21d445d7-run-ovn\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.206967 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ca01ee6-13a5-4472-a302-f58e21d445d7-ovnkube-script-lib\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.207331 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ca01ee6-13a5-4472-a302-f58e21d445d7-ovnkube-config\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.209207 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ca01ee6-13a5-4472-a302-f58e21d445d7-ovn-node-metrics-cert\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.221457 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzlkw\" (UniqueName: \"kubernetes.io/projected/6ca01ee6-13a5-4472-a302-f58e21d445d7-kube-api-access-fzlkw\") pod \"ovnkube-node-n58rp\" (UID: \"6ca01ee6-13a5-4472-a302-f58e21d445d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-n58rp" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.375455 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-n58rp" Mar 20 07:26:12 crc kubenswrapper[4749]: W0320 07:26:12.406072 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ca01ee6_13a5_4472_a302_f58e21d445d7.slice/crio-b8fe9af9edc90dde92afaffeae7cd6f2a9cc803fccd19be7008349f93f3e9cb9 WatchSource:0}: Error finding container b8fe9af9edc90dde92afaffeae7cd6f2a9cc803fccd19be7008349f93f3e9cb9: Status 404 returned error can't find the container with id b8fe9af9edc90dde92afaffeae7cd6f2a9cc803fccd19be7008349f93f3e9cb9 Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.420393 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rcq9v_3f813da7-84d4-4550-ad66-f282814444a3/kube-multus/2.log" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.420976 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rcq9v_3f813da7-84d4-4550-ad66-f282814444a3/kube-multus/1.log" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.421035 4749 generic.go:334] "Generic (PLEG): container finished" podID="3f813da7-84d4-4550-ad66-f282814444a3" containerID="a472c3325b9b11a217ab5fe9ec06f916f27c82ae9b673a67e50a89cf56598aeb" exitCode=2 Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.421166 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rcq9v" event={"ID":"3f813da7-84d4-4550-ad66-f282814444a3","Type":"ContainerDied","Data":"a472c3325b9b11a217ab5fe9ec06f916f27c82ae9b673a67e50a89cf56598aeb"} Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.421225 4749 scope.go:117] "RemoveContainer" containerID="290c8178fc52bf0ce040051ac3f6e31f5f5245203c3a61c98c6a723710fbb94b" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.422085 4749 scope.go:117] "RemoveContainer" containerID="a472c3325b9b11a217ab5fe9ec06f916f27c82ae9b673a67e50a89cf56598aeb" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.424459 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n58rp" event={"ID":"6ca01ee6-13a5-4472-a302-f58e21d445d7","Type":"ContainerStarted","Data":"b8fe9af9edc90dde92afaffeae7cd6f2a9cc803fccd19be7008349f93f3e9cb9"} Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.427042 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tdgcw_2153d97b-a108-49f8-b6c8-8223ea65b878/ovnkube-controller/3.log" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.434643 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tdgcw_2153d97b-a108-49f8-b6c8-8223ea65b878/ovn-acl-logging/0.log" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.435920 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tdgcw_2153d97b-a108-49f8-b6c8-8223ea65b878/ovn-controller/0.log" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.437116 4749 generic.go:334] "Generic (PLEG): container finished" podID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerID="42d0d0f46d701b86a49a246c4107b89cf1500d319a478ae5e89dcd611706ce84" exitCode=0 Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.437156 4749 generic.go:334] "Generic (PLEG): container finished" podID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerID="8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea" exitCode=0 Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.437174 4749 
generic.go:334] "Generic (PLEG): container finished" podID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerID="390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e" exitCode=0 Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.437187 4749 generic.go:334] "Generic (PLEG): container finished" podID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerID="f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be" exitCode=0 Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.437191 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.437199 4749 generic.go:334] "Generic (PLEG): container finished" podID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerID="ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e" exitCode=0 Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.437212 4749 generic.go:334] "Generic (PLEG): container finished" podID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerID="e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd" exitCode=0 Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.437223 4749 generic.go:334] "Generic (PLEG): container finished" podID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerID="adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189" exitCode=143 Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.437236 4749 generic.go:334] "Generic (PLEG): container finished" podID="2153d97b-a108-49f8-b6c8-8223ea65b878" containerID="e4f35833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2" exitCode=143 Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.437309 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" event={"ID":"2153d97b-a108-49f8-b6c8-8223ea65b878","Type":"ContainerDied","Data":"42d0d0f46d701b86a49a246c4107b89cf1500d319a478ae5e89dcd611706ce84"} Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.437378 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" event={"ID":"2153d97b-a108-49f8-b6c8-8223ea65b878","Type":"ContainerDied","Data":"8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea"} Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.437404 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" event={"ID":"2153d97b-a108-49f8-b6c8-8223ea65b878","Type":"ContainerDied","Data":"390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e"} Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.437424 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" event={"ID":"2153d97b-a108-49f8-b6c8-8223ea65b878","Type":"ContainerDied","Data":"f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be"} Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.437444 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" event={"ID":"2153d97b-a108-49f8-b6c8-8223ea65b878","Type":"ContainerDied","Data":"ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e"} Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.437466 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" 
event={"ID":"2153d97b-a108-49f8-b6c8-8223ea65b878","Type":"ContainerDied","Data":"e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd"} Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.437484 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"42d0d0f46d701b86a49a246c4107b89cf1500d319a478ae5e89dcd611706ce84"} Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.437502 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b7b48ca93159bd68739b8c9e9752d534320245059bc6d665239b5f34bf14bcc9"} Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.437514 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea"} Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.437525 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e"} Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.437536 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be"} Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.437547 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e"} Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.437558 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd"} Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.437568 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189"} Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.437579 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e4f35833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2"} Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.437589 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc"} Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.437603 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" event={"ID":"2153d97b-a108-49f8-b6c8-8223ea65b878","Type":"ContainerDied","Data":"adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189"} Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.437619 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"42d0d0f46d701b86a49a246c4107b89cf1500d319a478ae5e89dcd611706ce84"} Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.437632 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b7b48ca93159bd68739b8c9e9752d534320245059bc6d665239b5f34bf14bcc9"} Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.437646 4749 pod_container_deletor.go:114] 
"Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea"} Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.437661 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e"} Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.437674 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be"} Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.437692 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e"} Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.437707 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd"} Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.437720 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189"} Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.437967 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e4f35833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2"} Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.437981 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc"} Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.437998 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" event={"ID":"2153d97b-a108-49f8-b6c8-8223ea65b878","Type":"ContainerDied","Data":"e4f35833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2"} Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.438046 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"42d0d0f46d701b86a49a246c4107b89cf1500d319a478ae5e89dcd611706ce84"} Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.438063 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b7b48ca93159bd68739b8c9e9752d534320245059bc6d665239b5f34bf14bcc9"} Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.438074 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea"} Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.438085 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e"} Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.438095 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be"} Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.438105 4749 pod_container_deletor.go:114] 
"Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e"} Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.438116 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd"} Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.438126 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189"} Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.438136 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e4f35833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2"} Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.438146 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc"} Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.438161 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tdgcw" event={"ID":"2153d97b-a108-49f8-b6c8-8223ea65b878","Type":"ContainerDied","Data":"689dbeb0340cfee1fccb56ef27e4c0b4ce438ecf525e95c1da70ea2bc9629731"} Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.438179 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"42d0d0f46d701b86a49a246c4107b89cf1500d319a478ae5e89dcd611706ce84"} Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.438193 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b7b48ca93159bd68739b8c9e9752d534320245059bc6d665239b5f34bf14bcc9"} Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.438203 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea"} Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.438213 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e"} Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.438223 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be"} Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.438361 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e"} Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.438388 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd"} Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.438429 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189"} Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.438446 4749 pod_container_deletor.go:114] 
"Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e4f35833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2"} Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.438457 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc"} Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.456244 4749 scope.go:117] "RemoveContainer" containerID="42d0d0f46d701b86a49a246c4107b89cf1500d319a478ae5e89dcd611706ce84" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.475050 4749 scope.go:117] "RemoveContainer" containerID="b7b48ca93159bd68739b8c9e9752d534320245059bc6d665239b5f34bf14bcc9" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.477828 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-tdgcw"] Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.486069 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-tdgcw"] Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.506228 4749 scope.go:117] "RemoveContainer" containerID="8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.524662 4749 scope.go:117] "RemoveContainer" containerID="390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.546544 4749 scope.go:117] "RemoveContainer" containerID="f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.566524 4749 scope.go:117] "RemoveContainer" containerID="ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.622512 4749 scope.go:117] "RemoveContainer" containerID="e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.644195 4749 scope.go:117] "RemoveContainer" containerID="adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.664264 4749 scope.go:117] "RemoveContainer" containerID="e4f35833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.688731 4749 scope.go:117] "RemoveContainer" containerID="0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.706222 4749 scope.go:117] "RemoveContainer" containerID="42d0d0f46d701b86a49a246c4107b89cf1500d319a478ae5e89dcd611706ce84" Mar 20 07:26:12 crc kubenswrapper[4749]: E0320 07:26:12.706940 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42d0d0f46d701b86a49a246c4107b89cf1500d319a478ae5e89dcd611706ce84\": container with ID starting with 42d0d0f46d701b86a49a246c4107b89cf1500d319a478ae5e89dcd611706ce84 not found: ID does not exist" containerID="42d0d0f46d701b86a49a246c4107b89cf1500d319a478ae5e89dcd611706ce84" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.706982 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42d0d0f46d701b86a49a246c4107b89cf1500d319a478ae5e89dcd611706ce84"} err="failed to get container status \"42d0d0f46d701b86a49a246c4107b89cf1500d319a478ae5e89dcd611706ce84\": rpc error: code = NotFound desc = could not 
find container \"42d0d0f46d701b86a49a246c4107b89cf1500d319a478ae5e89dcd611706ce84\": container with ID starting with 42d0d0f46d701b86a49a246c4107b89cf1500d319a478ae5e89dcd611706ce84 not found: ID does not exist" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.707006 4749 scope.go:117] "RemoveContainer" containerID="b7b48ca93159bd68739b8c9e9752d534320245059bc6d665239b5f34bf14bcc9" Mar 20 07:26:12 crc kubenswrapper[4749]: E0320 07:26:12.707544 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7b48ca93159bd68739b8c9e9752d534320245059bc6d665239b5f34bf14bcc9\": container with ID starting with b7b48ca93159bd68739b8c9e9752d534320245059bc6d665239b5f34bf14bcc9 not found: ID does not exist" containerID="b7b48ca93159bd68739b8c9e9752d534320245059bc6d665239b5f34bf14bcc9" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.707575 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7b48ca93159bd68739b8c9e9752d534320245059bc6d665239b5f34bf14bcc9"} err="failed to get container status \"b7b48ca93159bd68739b8c9e9752d534320245059bc6d665239b5f34bf14bcc9\": rpc error: code = NotFound desc = could not find container \"b7b48ca93159bd68739b8c9e9752d534320245059bc6d665239b5f34bf14bcc9\": container with ID starting with b7b48ca93159bd68739b8c9e9752d534320245059bc6d665239b5f34bf14bcc9 not found: ID does not exist" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.707591 4749 scope.go:117] "RemoveContainer" containerID="8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea" Mar 20 07:26:12 crc kubenswrapper[4749]: E0320 07:26:12.710988 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea\": container with ID starting with 8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea not found: ID does not exist" containerID="8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.711017 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea"} err="failed to get container status \"8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea\": rpc error: code = NotFound desc = could not find container \"8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea\": container with ID starting with 8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea not found: ID does not exist" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.711038 4749 scope.go:117] "RemoveContainer" containerID="390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e" Mar 20 07:26:12 crc kubenswrapper[4749]: E0320 07:26:12.711667 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e\": container with ID starting with 390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e not found: ID does not exist" containerID="390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.711709 4749 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e"} err="failed to get container status \"390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e\": rpc error: code = NotFound desc = could not find container \"390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e\": container with ID starting with 390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e not found: ID does not exist" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.711736 4749 scope.go:117] "RemoveContainer" containerID="f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be" Mar 20 07:26:12 crc kubenswrapper[4749]: E0320 07:26:12.712087 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be\": container with ID starting with f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be not found: ID does not exist" containerID="f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.712115 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be"} err="failed to get container status \"f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be\": rpc error: code = NotFound desc = could not find container \"f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be\": container with ID starting with f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be not found: ID does not exist" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.712132 4749 scope.go:117] "RemoveContainer" containerID="ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e" Mar 20 07:26:12 crc kubenswrapper[4749]: E0320 07:26:12.712502 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e\": container with ID starting with ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e not found: ID does not exist" containerID="ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.712528 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e"} err="failed to get container status \"ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e\": rpc error: code = NotFound desc = could not find container \"ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e\": container with ID starting with ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e not found: ID does not exist" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.712547 4749 scope.go:117] "RemoveContainer" containerID="e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd" Mar 20 07:26:12 crc kubenswrapper[4749]: E0320 07:26:12.712905 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd\": container with ID starting with e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd not found: ID does not exist" 
containerID="e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.712926 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd"} err="failed to get container status \"e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd\": rpc error: code = NotFound desc = could not find container \"e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd\": container with ID starting with e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd not found: ID does not exist" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.712970 4749 scope.go:117] "RemoveContainer" containerID="adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189" Mar 20 07:26:12 crc kubenswrapper[4749]: E0320 07:26:12.713418 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189\": container with ID starting with adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189 not found: ID does not exist" containerID="adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.713444 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189"} err="failed to get container status \"adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189\": rpc error: code = NotFound desc = could not find container \"adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189\": container with ID starting with adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189 not found: ID does not exist" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.713460 4749 scope.go:117] "RemoveContainer" containerID="e4f35833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2" Mar 20 07:26:12 crc kubenswrapper[4749]: E0320 07:26:12.713810 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4f35833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2\": container with ID starting with e4f35833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2 not found: ID does not exist" containerID="e4f35833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.713850 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4f35833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2"} err="failed to get container status \"e4f35833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2\": rpc error: code = NotFound desc = could not find container \"e4f35833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2\": container with ID starting with e4f35833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2 not found: ID does not exist" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.713869 4749 scope.go:117] "RemoveContainer" containerID="0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc" Mar 20 07:26:12 crc kubenswrapper[4749]: E0320 07:26:12.714376 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\": container with ID starting with 0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc not found: ID does not exist" containerID="0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.714419 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc"} err="failed to get container status \"0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\": rpc error: code = NotFound desc = could not find container \"0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\": container with ID starting with 0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc not found: ID does not exist" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.714442 4749 scope.go:117] "RemoveContainer" containerID="42d0d0f46d701b86a49a246c4107b89cf1500d319a478ae5e89dcd611706ce84" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.714762 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42d0d0f46d701b86a49a246c4107b89cf1500d319a478ae5e89dcd611706ce84"} err="failed to get container status \"42d0d0f46d701b86a49a246c4107b89cf1500d319a478ae5e89dcd611706ce84\": rpc error: code = NotFound desc = could not find container \"42d0d0f46d701b86a49a246c4107b89cf1500d319a478ae5e89dcd611706ce84\": container with ID starting with 42d0d0f46d701b86a49a246c4107b89cf1500d319a478ae5e89dcd611706ce84 not found: ID does not exist" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.714791 4749 scope.go:117] "RemoveContainer" containerID="b7b48ca93159bd68739b8c9e9752d534320245059bc6d665239b5f34bf14bcc9" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.715204 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7b48ca93159bd68739b8c9e9752d534320245059bc6d665239b5f34bf14bcc9"} err="failed to get container status \"b7b48ca93159bd68739b8c9e9752d534320245059bc6d665239b5f34bf14bcc9\": rpc error: code = NotFound desc = could not find container \"b7b48ca93159bd68739b8c9e9752d534320245059bc6d665239b5f34bf14bcc9\": container with ID starting with b7b48ca93159bd68739b8c9e9752d534320245059bc6d665239b5f34bf14bcc9 not found: ID does not exist" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.715229 4749 scope.go:117] "RemoveContainer" containerID="8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.715536 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea"} err="failed to get container status \"8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea\": rpc error: code = NotFound desc = could not find container \"8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea\": container with ID starting with 8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea not found: ID does not exist" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.715562 4749 scope.go:117] "RemoveContainer" containerID="390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.715917 4749 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e"} err="failed to get container status \"390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e\": rpc error: code = NotFound desc = could not find container \"390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e\": container with ID starting with 390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e not found: ID does not exist" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.715939 4749 scope.go:117] "RemoveContainer" containerID="f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.716219 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be"} err="failed to get container status \"f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be\": rpc error: code = NotFound desc = could not find container \"f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be\": container with ID starting with f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be not found: ID does not exist" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.716458 4749 scope.go:117] "RemoveContainer" containerID="ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.717268 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e"} err="failed to get container status \"ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e\": rpc error: code = NotFound desc = could not find container \"ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e\": container with ID starting with ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e not found: ID does not exist" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.717320 4749 scope.go:117] "RemoveContainer" containerID="e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.717668 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd"} err="failed to get container status \"e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd\": rpc error: code = NotFound desc = could not find container \"e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd\": container with ID starting with e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd not found: ID does not exist" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.717700 4749 scope.go:117] "RemoveContainer" containerID="adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.718118 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189"} err="failed to get container status \"adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189\": rpc error: code = NotFound desc = could not find container \"adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189\": container with ID starting with adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189 not found: ID does not exist" Mar 
20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.718152 4749 scope.go:117] "RemoveContainer" containerID="e4f35833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.718607 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4f35833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2"} err="failed to get container status \"e4f35833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2\": rpc error: code = NotFound desc = could not find container \"e4f35833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2\": container with ID starting with e4f35833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2 not found: ID does not exist" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.718641 4749 scope.go:117] "RemoveContainer" containerID="0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.718967 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc"} err="failed to get container status \"0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\": rpc error: code = NotFound desc = could not find container \"0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\": container with ID starting with 0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc not found: ID does not exist" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.719002 4749 scope.go:117] "RemoveContainer" containerID="42d0d0f46d701b86a49a246c4107b89cf1500d319a478ae5e89dcd611706ce84" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.719453 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42d0d0f46d701b86a49a246c4107b89cf1500d319a478ae5e89dcd611706ce84"} err="failed to get container status \"42d0d0f46d701b86a49a246c4107b89cf1500d319a478ae5e89dcd611706ce84\": rpc error: code = NotFound desc = could not find container \"42d0d0f46d701b86a49a246c4107b89cf1500d319a478ae5e89dcd611706ce84\": container with ID starting with 42d0d0f46d701b86a49a246c4107b89cf1500d319a478ae5e89dcd611706ce84 not found: ID does not exist" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.719486 4749 scope.go:117] "RemoveContainer" containerID="b7b48ca93159bd68739b8c9e9752d534320245059bc6d665239b5f34bf14bcc9" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.719771 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7b48ca93159bd68739b8c9e9752d534320245059bc6d665239b5f34bf14bcc9"} err="failed to get container status \"b7b48ca93159bd68739b8c9e9752d534320245059bc6d665239b5f34bf14bcc9\": rpc error: code = NotFound desc = could not find container \"b7b48ca93159bd68739b8c9e9752d534320245059bc6d665239b5f34bf14bcc9\": container with ID starting with b7b48ca93159bd68739b8c9e9752d534320245059bc6d665239b5f34bf14bcc9 not found: ID does not exist" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.719806 4749 scope.go:117] "RemoveContainer" containerID="8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.720145 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea"} err="failed to get container status 
\"8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea\": rpc error: code = NotFound desc = could not find container \"8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea\": container with ID starting with 8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea not found: ID does not exist" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.720180 4749 scope.go:117] "RemoveContainer" containerID="390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.720504 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e"} err="failed to get container status \"390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e\": rpc error: code = NotFound desc = could not find container \"390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e\": container with ID starting with 390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e not found: ID does not exist" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.720538 4749 scope.go:117] "RemoveContainer" containerID="f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.720952 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be"} err="failed to get container status \"f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be\": rpc error: code = NotFound desc = could not find container \"f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be\": container with ID starting with f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be not found: ID does not exist" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.720977 4749 scope.go:117] "RemoveContainer" containerID="ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.721327 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e"} err="failed to get container status \"ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e\": rpc error: code = NotFound desc = could not find container \"ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e\": container with ID starting with ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e not found: ID does not exist" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.721355 4749 scope.go:117] "RemoveContainer" containerID="e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.721694 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd"} err="failed to get container status \"e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd\": rpc error: code = NotFound desc = could not find container \"e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd\": container with ID starting with e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd not found: ID does not exist" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.721727 4749 scope.go:117] "RemoveContainer" 
containerID="adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.722054 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189"} err="failed to get container status \"adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189\": rpc error: code = NotFound desc = could not find container \"adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189\": container with ID starting with adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189 not found: ID does not exist" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.722079 4749 scope.go:117] "RemoveContainer" containerID="e4f35833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.722449 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4f35833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2"} err="failed to get container status \"e4f35833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2\": rpc error: code = NotFound desc = could not find container \"e4f35833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2\": container with ID starting with e4f35833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2 not found: ID does not exist" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.722474 4749 scope.go:117] "RemoveContainer" containerID="0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.722794 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc"} err="failed to get container status \"0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\": rpc error: code = NotFound desc = could not find container \"0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\": container with ID starting with 0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc not found: ID does not exist" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.722818 4749 scope.go:117] "RemoveContainer" containerID="42d0d0f46d701b86a49a246c4107b89cf1500d319a478ae5e89dcd611706ce84" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.723055 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42d0d0f46d701b86a49a246c4107b89cf1500d319a478ae5e89dcd611706ce84"} err="failed to get container status \"42d0d0f46d701b86a49a246c4107b89cf1500d319a478ae5e89dcd611706ce84\": rpc error: code = NotFound desc = could not find container \"42d0d0f46d701b86a49a246c4107b89cf1500d319a478ae5e89dcd611706ce84\": container with ID starting with 42d0d0f46d701b86a49a246c4107b89cf1500d319a478ae5e89dcd611706ce84 not found: ID does not exist" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.723085 4749 scope.go:117] "RemoveContainer" containerID="b7b48ca93159bd68739b8c9e9752d534320245059bc6d665239b5f34bf14bcc9" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.723553 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7b48ca93159bd68739b8c9e9752d534320245059bc6d665239b5f34bf14bcc9"} err="failed to get container status \"b7b48ca93159bd68739b8c9e9752d534320245059bc6d665239b5f34bf14bcc9\": rpc error: code = NotFound desc = could not find 
container \"b7b48ca93159bd68739b8c9e9752d534320245059bc6d665239b5f34bf14bcc9\": container with ID starting with b7b48ca93159bd68739b8c9e9752d534320245059bc6d665239b5f34bf14bcc9 not found: ID does not exist" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.723588 4749 scope.go:117] "RemoveContainer" containerID="8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.724037 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea"} err="failed to get container status \"8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea\": rpc error: code = NotFound desc = could not find container \"8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea\": container with ID starting with 8f0b9f4c789d63dc388f87e591399a7d4de5b688917cb6f709e1730fa176a0ea not found: ID does not exist" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.724069 4749 scope.go:117] "RemoveContainer" containerID="390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.724775 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e"} err="failed to get container status \"390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e\": rpc error: code = NotFound desc = could not find container \"390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e\": container with ID starting with 390c963e22ced0817e5c02160c1d52e2290108f931319ccbf5c2e6818730f18e not found: ID does not exist" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.724804 4749 scope.go:117] "RemoveContainer" containerID="f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.725304 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be"} err="failed to get container status \"f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be\": rpc error: code = NotFound desc = could not find container \"f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be\": container with ID starting with f6b5c46e8c34c081d9b8ac7dc1622f493ce48dd2d647beca8f35404f733702be not found: ID does not exist" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.725337 4749 scope.go:117] "RemoveContainer" containerID="ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.725780 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e"} err="failed to get container status \"ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e\": rpc error: code = NotFound desc = could not find container \"ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e\": container with ID starting with ca9c244563a4d1715c28933608aa4df2239452b396669d622aa6fb13ad1c295e not found: ID does not exist" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.725813 4749 scope.go:117] "RemoveContainer" containerID="e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.726105 4749 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd"} err="failed to get container status \"e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd\": rpc error: code = NotFound desc = could not find container \"e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd\": container with ID starting with e9aa288f44a509172b56d48970e117877a686d122bf060f1248d79686a4bfcdd not found: ID does not exist" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.726139 4749 scope.go:117] "RemoveContainer" containerID="adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.726562 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189"} err="failed to get container status \"adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189\": rpc error: code = NotFound desc = could not find container \"adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189\": container with ID starting with adacfd87e02577b308b0aed3be3d3574f120cf3907620889a5aa3ac78ac29189 not found: ID does not exist" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.726633 4749 scope.go:117] "RemoveContainer" containerID="e4f35833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.726972 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4f35833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2"} err="failed to get container status \"e4f35833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2\": rpc error: code = NotFound desc = could not find container \"e4f35833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2\": container with ID starting with e4f35833e7456fe5e2cee4d584392b7b0e3ceb4541be53bc95a49e20c8bf03a2 not found: ID does not exist" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.727007 4749 scope.go:117] "RemoveContainer" containerID="0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc" Mar 20 07:26:12 crc kubenswrapper[4749]: I0320 07:26:12.727311 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc"} err="failed to get container status \"0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\": rpc error: code = NotFound desc = could not find container \"0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc\": container with ID starting with 0e4841705ccedfc3ac2abae17659ec31ffca7a486a267bb29c8c185c168c24fc not found: ID does not exist" Mar 20 07:26:13 crc kubenswrapper[4749]: I0320 07:26:13.449881 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rcq9v_3f813da7-84d4-4550-ad66-f282814444a3/kube-multus/2.log" Mar 20 07:26:13 crc kubenswrapper[4749]: I0320 07:26:13.450067 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rcq9v" event={"ID":"3f813da7-84d4-4550-ad66-f282814444a3","Type":"ContainerStarted","Data":"9b5204d1fda45db742b676c02899981bfe14a448fe72e5604325ed684c6b8e2f"} Mar 20 07:26:13 crc kubenswrapper[4749]: I0320 07:26:13.453123 4749 generic.go:334] "Generic (PLEG): container finished" podID="6ca01ee6-13a5-4472-a302-f58e21d445d7" 
containerID="e1da62d07318bf996a040b91a315892c91cb30c18b92c7da3915611270df918d" exitCode=0 Mar 20 07:26:13 crc kubenswrapper[4749]: I0320 07:26:13.453381 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n58rp" event={"ID":"6ca01ee6-13a5-4472-a302-f58e21d445d7","Type":"ContainerDied","Data":"e1da62d07318bf996a040b91a315892c91cb30c18b92c7da3915611270df918d"} Mar 20 07:26:14 crc kubenswrapper[4749]: I0320 07:26:14.185561 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2153d97b-a108-49f8-b6c8-8223ea65b878" path="/var/lib/kubelet/pods/2153d97b-a108-49f8-b6c8-8223ea65b878/volumes" Mar 20 07:26:14 crc kubenswrapper[4749]: I0320 07:26:14.465165 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n58rp" event={"ID":"6ca01ee6-13a5-4472-a302-f58e21d445d7","Type":"ContainerStarted","Data":"38ab5f0a73e410083d4e945aa00c41f7fe5b55eea1ca23dfebc3a41993be2f88"} Mar 20 07:26:14 crc kubenswrapper[4749]: I0320 07:26:14.465535 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n58rp" event={"ID":"6ca01ee6-13a5-4472-a302-f58e21d445d7","Type":"ContainerStarted","Data":"4e74d6c1c897862efef420aeefcb51bc5c4438ac7da1820e6af629182849e591"} Mar 20 07:26:14 crc kubenswrapper[4749]: I0320 07:26:14.465560 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n58rp" event={"ID":"6ca01ee6-13a5-4472-a302-f58e21d445d7","Type":"ContainerStarted","Data":"3388504c86ccd4b4ec60c7f8ea7c625dc27b58de4bcde2d02705541da1d46615"} Mar 20 07:26:14 crc kubenswrapper[4749]: I0320 07:26:14.465577 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n58rp" event={"ID":"6ca01ee6-13a5-4472-a302-f58e21d445d7","Type":"ContainerStarted","Data":"5211a715b489a76584b393a244558e812458c544c638b85fb4829ce9e20554fe"} Mar 20 07:26:14 crc kubenswrapper[4749]: I0320 07:26:14.465592 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n58rp" event={"ID":"6ca01ee6-13a5-4472-a302-f58e21d445d7","Type":"ContainerStarted","Data":"e784bb5da198fffb2b5c3459e36a68dd7ffe3fef04c7786d97f4c31f10eb9248"} Mar 20 07:26:14 crc kubenswrapper[4749]: I0320 07:26:14.465610 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n58rp" event={"ID":"6ca01ee6-13a5-4472-a302-f58e21d445d7","Type":"ContainerStarted","Data":"5196d6e4790ca4598b66b851290c82413c56d2f11c266b3dadf80bc07521c56d"} Mar 20 07:26:17 crc kubenswrapper[4749]: I0320 07:26:17.491481 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n58rp" event={"ID":"6ca01ee6-13a5-4472-a302-f58e21d445d7","Type":"ContainerStarted","Data":"350d12cfa80dd54fa8377e1135d5e2767c9eb6b2be52a652beadfe0a4ae44a5e"} Mar 20 07:26:19 crc kubenswrapper[4749]: I0320 07:26:19.507499 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n58rp" event={"ID":"6ca01ee6-13a5-4472-a302-f58e21d445d7","Type":"ContainerStarted","Data":"c0270edfa9a095ad7eb35ff4c26d4b181acf4a91599d511e237fb5c03772a403"} Mar 20 07:26:19 crc kubenswrapper[4749]: I0320 07:26:19.507803 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-n58rp" Mar 20 07:26:19 crc kubenswrapper[4749]: I0320 07:26:19.547889 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ovn-kubernetes/ovnkube-node-n58rp" podStartSLOduration=7.547870948 podStartE2EDuration="7.547870948s" podCreationTimestamp="2026-03-20 07:26:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:26:19.544609299 +0000 UTC m=+816.094266986" watchObservedRunningTime="2026-03-20 07:26:19.547870948 +0000 UTC m=+816.097528595" Mar 20 07:26:19 crc kubenswrapper[4749]: I0320 07:26:19.570820 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-n58rp" Mar 20 07:26:20 crc kubenswrapper[4749]: I0320 07:26:20.516650 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-n58rp" Mar 20 07:26:20 crc kubenswrapper[4749]: I0320 07:26:20.516715 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-n58rp" Mar 20 07:26:20 crc kubenswrapper[4749]: I0320 07:26:20.520023 4749 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 20 07:26:20 crc kubenswrapper[4749]: I0320 07:26:20.569640 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-n58rp" Mar 20 07:26:42 crc kubenswrapper[4749]: I0320 07:26:42.416445 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-n58rp" Mar 20 07:26:47 crc kubenswrapper[4749]: I0320 07:26:47.443814 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-m896z"] Mar 20 07:26:47 crc kubenswrapper[4749]: I0320 07:26:47.446846 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m896z" Mar 20 07:26:47 crc kubenswrapper[4749]: I0320 07:26:47.466099 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m896z"] Mar 20 07:26:47 crc kubenswrapper[4749]: I0320 07:26:47.581949 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nw8v\" (UniqueName: \"kubernetes.io/projected/2194420d-f882-4caf-bdd8-8942bdaadabf-kube-api-access-6nw8v\") pod \"redhat-operators-m896z\" (UID: \"2194420d-f882-4caf-bdd8-8942bdaadabf\") " pod="openshift-marketplace/redhat-operators-m896z" Mar 20 07:26:47 crc kubenswrapper[4749]: I0320 07:26:47.582094 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2194420d-f882-4caf-bdd8-8942bdaadabf-utilities\") pod \"redhat-operators-m896z\" (UID: \"2194420d-f882-4caf-bdd8-8942bdaadabf\") " pod="openshift-marketplace/redhat-operators-m896z" Mar 20 07:26:47 crc kubenswrapper[4749]: I0320 07:26:47.582131 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2194420d-f882-4caf-bdd8-8942bdaadabf-catalog-content\") pod \"redhat-operators-m896z\" (UID: \"2194420d-f882-4caf-bdd8-8942bdaadabf\") " pod="openshift-marketplace/redhat-operators-m896z" Mar 20 07:26:47 crc kubenswrapper[4749]: I0320 07:26:47.683395 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nw8v\" (UniqueName: \"kubernetes.io/projected/2194420d-f882-4caf-bdd8-8942bdaadabf-kube-api-access-6nw8v\") pod \"redhat-operators-m896z\" (UID: \"2194420d-f882-4caf-bdd8-8942bdaadabf\") " pod="openshift-marketplace/redhat-operators-m896z" Mar 20 07:26:47 crc kubenswrapper[4749]: I0320 07:26:47.683594 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2194420d-f882-4caf-bdd8-8942bdaadabf-utilities\") pod \"redhat-operators-m896z\" (UID: \"2194420d-f882-4caf-bdd8-8942bdaadabf\") " pod="openshift-marketplace/redhat-operators-m896z" Mar 20 07:26:47 crc kubenswrapper[4749]: I0320 07:26:47.683656 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2194420d-f882-4caf-bdd8-8942bdaadabf-catalog-content\") pod \"redhat-operators-m896z\" (UID: \"2194420d-f882-4caf-bdd8-8942bdaadabf\") " pod="openshift-marketplace/redhat-operators-m896z" Mar 20 07:26:47 crc kubenswrapper[4749]: I0320 07:26:47.684216 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2194420d-f882-4caf-bdd8-8942bdaadabf-utilities\") pod \"redhat-operators-m896z\" (UID: \"2194420d-f882-4caf-bdd8-8942bdaadabf\") " pod="openshift-marketplace/redhat-operators-m896z" Mar 20 07:26:47 crc kubenswrapper[4749]: I0320 07:26:47.684470 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2194420d-f882-4caf-bdd8-8942bdaadabf-catalog-content\") pod \"redhat-operators-m896z\" (UID: \"2194420d-f882-4caf-bdd8-8942bdaadabf\") " pod="openshift-marketplace/redhat-operators-m896z" Mar 20 07:26:47 crc kubenswrapper[4749]: I0320 07:26:47.703092 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-6nw8v\" (UniqueName: \"kubernetes.io/projected/2194420d-f882-4caf-bdd8-8942bdaadabf-kube-api-access-6nw8v\") pod \"redhat-operators-m896z\" (UID: \"2194420d-f882-4caf-bdd8-8942bdaadabf\") " pod="openshift-marketplace/redhat-operators-m896z" Mar 20 07:26:47 crc kubenswrapper[4749]: I0320 07:26:47.811783 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m896z" Mar 20 07:26:48 crc kubenswrapper[4749]: I0320 07:26:48.053630 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m896z"] Mar 20 07:26:48 crc kubenswrapper[4749]: I0320 07:26:48.705064 4749 generic.go:334] "Generic (PLEG): container finished" podID="2194420d-f882-4caf-bdd8-8942bdaadabf" containerID="428ca903ca2c7775c7b25490756bf0ed2ba820c35461cb3886b8e9c34c599a0e" exitCode=0 Mar 20 07:26:48 crc kubenswrapper[4749]: I0320 07:26:48.705151 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m896z" event={"ID":"2194420d-f882-4caf-bdd8-8942bdaadabf","Type":"ContainerDied","Data":"428ca903ca2c7775c7b25490756bf0ed2ba820c35461cb3886b8e9c34c599a0e"} Mar 20 07:26:48 crc kubenswrapper[4749]: I0320 07:26:48.705347 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m896z" event={"ID":"2194420d-f882-4caf-bdd8-8942bdaadabf","Type":"ContainerStarted","Data":"ea0e95e16d958d60ca6f067e22692708a1609f2d9069208163b81cd940f9dc7b"} Mar 20 07:26:49 crc kubenswrapper[4749]: I0320 07:26:49.716648 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m896z" event={"ID":"2194420d-f882-4caf-bdd8-8942bdaadabf","Type":"ContainerStarted","Data":"cd73b990aaca9c5ffe1a74b128e626f81e1c43e00b29a52136c777433ef05d00"} Mar 20 07:26:50 crc kubenswrapper[4749]: I0320 07:26:50.726759 4749 generic.go:334] "Generic (PLEG): container finished" podID="2194420d-f882-4caf-bdd8-8942bdaadabf" containerID="cd73b990aaca9c5ffe1a74b128e626f81e1c43e00b29a52136c777433ef05d00" exitCode=0 Mar 20 07:26:50 crc kubenswrapper[4749]: I0320 07:26:50.726851 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m896z" event={"ID":"2194420d-f882-4caf-bdd8-8942bdaadabf","Type":"ContainerDied","Data":"cd73b990aaca9c5ffe1a74b128e626f81e1c43e00b29a52136c777433ef05d00"} Mar 20 07:26:51 crc kubenswrapper[4749]: I0320 07:26:51.125163 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n7mld"] Mar 20 07:26:51 crc kubenswrapper[4749]: I0320 07:26:51.134974 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n7mld" Mar 20 07:26:51 crc kubenswrapper[4749]: I0320 07:26:51.137813 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 20 07:26:51 crc kubenswrapper[4749]: I0320 07:26:51.155430 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n7mld"] Mar 20 07:26:51 crc kubenswrapper[4749]: I0320 07:26:51.231963 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/68f28f80-0a90-4e42-ac6e-66fff5eec59a-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n7mld\" (UID: \"68f28f80-0a90-4e42-ac6e-66fff5eec59a\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n7mld" Mar 20 07:26:51 crc kubenswrapper[4749]: I0320 07:26:51.232042 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/68f28f80-0a90-4e42-ac6e-66fff5eec59a-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n7mld\" (UID: \"68f28f80-0a90-4e42-ac6e-66fff5eec59a\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n7mld" Mar 20 07:26:51 crc kubenswrapper[4749]: I0320 07:26:51.232133 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvls9\" (UniqueName: \"kubernetes.io/projected/68f28f80-0a90-4e42-ac6e-66fff5eec59a-kube-api-access-mvls9\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n7mld\" (UID: \"68f28f80-0a90-4e42-ac6e-66fff5eec59a\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n7mld" Mar 20 07:26:51 crc kubenswrapper[4749]: I0320 07:26:51.333881 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/68f28f80-0a90-4e42-ac6e-66fff5eec59a-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n7mld\" (UID: \"68f28f80-0a90-4e42-ac6e-66fff5eec59a\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n7mld" Mar 20 07:26:51 crc kubenswrapper[4749]: I0320 07:26:51.334045 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/68f28f80-0a90-4e42-ac6e-66fff5eec59a-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n7mld\" (UID: \"68f28f80-0a90-4e42-ac6e-66fff5eec59a\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n7mld" Mar 20 07:26:51 crc kubenswrapper[4749]: I0320 07:26:51.334864 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/68f28f80-0a90-4e42-ac6e-66fff5eec59a-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n7mld\" (UID: \"68f28f80-0a90-4e42-ac6e-66fff5eec59a\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n7mld" Mar 20 07:26:51 crc kubenswrapper[4749]: I0320 07:26:51.335139 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/68f28f80-0a90-4e42-ac6e-66fff5eec59a-bundle\") pod 
\"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n7mld\" (UID: \"68f28f80-0a90-4e42-ac6e-66fff5eec59a\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n7mld" Mar 20 07:26:51 crc kubenswrapper[4749]: I0320 07:26:51.335143 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvls9\" (UniqueName: \"kubernetes.io/projected/68f28f80-0a90-4e42-ac6e-66fff5eec59a-kube-api-access-mvls9\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n7mld\" (UID: \"68f28f80-0a90-4e42-ac6e-66fff5eec59a\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n7mld" Mar 20 07:26:51 crc kubenswrapper[4749]: I0320 07:26:51.357232 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvls9\" (UniqueName: \"kubernetes.io/projected/68f28f80-0a90-4e42-ac6e-66fff5eec59a-kube-api-access-mvls9\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n7mld\" (UID: \"68f28f80-0a90-4e42-ac6e-66fff5eec59a\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n7mld" Mar 20 07:26:51 crc kubenswrapper[4749]: I0320 07:26:51.455365 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n7mld" Mar 20 07:26:51 crc kubenswrapper[4749]: I0320 07:26:51.738675 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m896z" event={"ID":"2194420d-f882-4caf-bdd8-8942bdaadabf","Type":"ContainerStarted","Data":"922728821a5caa77948eeca479cbaa194eed3f012673a50eba2218b4fb29ba37"} Mar 20 07:26:51 crc kubenswrapper[4749]: I0320 07:26:51.954316 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-m896z" podStartSLOduration=2.506795113 podStartE2EDuration="4.954266284s" podCreationTimestamp="2026-03-20 07:26:47 +0000 UTC" firstStartedPulling="2026-03-20 07:26:48.706899792 +0000 UTC m=+845.256557439" lastFinishedPulling="2026-03-20 07:26:51.154370963 +0000 UTC m=+847.704028610" observedRunningTime="2026-03-20 07:26:51.76943273 +0000 UTC m=+848.319090387" watchObservedRunningTime="2026-03-20 07:26:51.954266284 +0000 UTC m=+848.503923941" Mar 20 07:26:51 crc kubenswrapper[4749]: I0320 07:26:51.956422 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n7mld"] Mar 20 07:26:51 crc kubenswrapper[4749]: W0320 07:26:51.962364 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68f28f80_0a90_4e42_ac6e_66fff5eec59a.slice/crio-039b4e224db86708c09ac56aa4ec3be73d78287f44567f49e1e12c0a91b294cb WatchSource:0}: Error finding container 039b4e224db86708c09ac56aa4ec3be73d78287f44567f49e1e12c0a91b294cb: Status 404 returned error can't find the container with id 039b4e224db86708c09ac56aa4ec3be73d78287f44567f49e1e12c0a91b294cb Mar 20 07:26:52 crc kubenswrapper[4749]: I0320 07:26:52.746625 4749 generic.go:334] "Generic (PLEG): container finished" podID="68f28f80-0a90-4e42-ac6e-66fff5eec59a" containerID="96e15a662ea67fdf955d3b3f5d1292c4221cfe87312606e0822b6695f6b766ba" exitCode=0 Mar 20 07:26:52 crc kubenswrapper[4749]: I0320 07:26:52.746752 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n7mld" event={"ID":"68f28f80-0a90-4e42-ac6e-66fff5eec59a","Type":"ContainerDied","Data":"96e15a662ea67fdf955d3b3f5d1292c4221cfe87312606e0822b6695f6b766ba"} Mar 20 07:26:52 crc kubenswrapper[4749]: I0320 07:26:52.746812 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n7mld" event={"ID":"68f28f80-0a90-4e42-ac6e-66fff5eec59a","Type":"ContainerStarted","Data":"039b4e224db86708c09ac56aa4ec3be73d78287f44567f49e1e12c0a91b294cb"} Mar 20 07:26:52 crc kubenswrapper[4749]: I0320 07:26:52.889923 4749 scope.go:117] "RemoveContainer" containerID="2ad0c0b0ad6dc2f74c3f216b96baf7c0c450cc28f3b1a294cb26e0160d039a78" Mar 20 07:26:54 crc kubenswrapper[4749]: I0320 07:26:54.768143 4749 generic.go:334] "Generic (PLEG): container finished" podID="68f28f80-0a90-4e42-ac6e-66fff5eec59a" containerID="3eb08933111c0d8d6338df82092e8a71d268c87e36664beae6b64cc4b09e855d" exitCode=0 Mar 20 07:26:54 crc kubenswrapper[4749]: I0320 07:26:54.768271 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n7mld" event={"ID":"68f28f80-0a90-4e42-ac6e-66fff5eec59a","Type":"ContainerDied","Data":"3eb08933111c0d8d6338df82092e8a71d268c87e36664beae6b64cc4b09e855d"} Mar 20 07:26:55 crc kubenswrapper[4749]: I0320 07:26:55.781874 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n7mld" event={"ID":"68f28f80-0a90-4e42-ac6e-66fff5eec59a","Type":"ContainerStarted","Data":"3c15c11547e1cfa2193b86f4e25f24725204ebcfa79f76f4f989aae71719afed"} Mar 20 07:26:56 crc kubenswrapper[4749]: I0320 07:26:56.803808 4749 generic.go:334] "Generic (PLEG): container finished" podID="68f28f80-0a90-4e42-ac6e-66fff5eec59a" containerID="3c15c11547e1cfa2193b86f4e25f24725204ebcfa79f76f4f989aae71719afed" exitCode=0 Mar 20 07:26:56 crc kubenswrapper[4749]: I0320 07:26:56.803866 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n7mld" event={"ID":"68f28f80-0a90-4e42-ac6e-66fff5eec59a","Type":"ContainerDied","Data":"3c15c11547e1cfa2193b86f4e25f24725204ebcfa79f76f4f989aae71719afed"} Mar 20 07:26:57 crc kubenswrapper[4749]: I0320 07:26:57.102752 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n7mld" Mar 20 07:26:57 crc kubenswrapper[4749]: I0320 07:26:57.215877 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvls9\" (UniqueName: \"kubernetes.io/projected/68f28f80-0a90-4e42-ac6e-66fff5eec59a-kube-api-access-mvls9\") pod \"68f28f80-0a90-4e42-ac6e-66fff5eec59a\" (UID: \"68f28f80-0a90-4e42-ac6e-66fff5eec59a\") " Mar 20 07:26:57 crc kubenswrapper[4749]: I0320 07:26:57.215952 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/68f28f80-0a90-4e42-ac6e-66fff5eec59a-bundle\") pod \"68f28f80-0a90-4e42-ac6e-66fff5eec59a\" (UID: \"68f28f80-0a90-4e42-ac6e-66fff5eec59a\") " Mar 20 07:26:57 crc kubenswrapper[4749]: I0320 07:26:57.216078 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/68f28f80-0a90-4e42-ac6e-66fff5eec59a-util\") pod \"68f28f80-0a90-4e42-ac6e-66fff5eec59a\" (UID: \"68f28f80-0a90-4e42-ac6e-66fff5eec59a\") " Mar 20 07:26:57 crc kubenswrapper[4749]: I0320 07:26:57.216819 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68f28f80-0a90-4e42-ac6e-66fff5eec59a-bundle" (OuterVolumeSpecName: "bundle") pod "68f28f80-0a90-4e42-ac6e-66fff5eec59a" (UID: "68f28f80-0a90-4e42-ac6e-66fff5eec59a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:26:57 crc kubenswrapper[4749]: I0320 07:26:57.222559 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68f28f80-0a90-4e42-ac6e-66fff5eec59a-kube-api-access-mvls9" (OuterVolumeSpecName: "kube-api-access-mvls9") pod "68f28f80-0a90-4e42-ac6e-66fff5eec59a" (UID: "68f28f80-0a90-4e42-ac6e-66fff5eec59a"). InnerVolumeSpecName "kube-api-access-mvls9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:26:57 crc kubenswrapper[4749]: I0320 07:26:57.237654 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68f28f80-0a90-4e42-ac6e-66fff5eec59a-util" (OuterVolumeSpecName: "util") pod "68f28f80-0a90-4e42-ac6e-66fff5eec59a" (UID: "68f28f80-0a90-4e42-ac6e-66fff5eec59a"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:26:57 crc kubenswrapper[4749]: I0320 07:26:57.318011 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvls9\" (UniqueName: \"kubernetes.io/projected/68f28f80-0a90-4e42-ac6e-66fff5eec59a-kube-api-access-mvls9\") on node \"crc\" DevicePath \"\"" Mar 20 07:26:57 crc kubenswrapper[4749]: I0320 07:26:57.318044 4749 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/68f28f80-0a90-4e42-ac6e-66fff5eec59a-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:26:57 crc kubenswrapper[4749]: I0320 07:26:57.318053 4749 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/68f28f80-0a90-4e42-ac6e-66fff5eec59a-util\") on node \"crc\" DevicePath \"\"" Mar 20 07:26:57 crc kubenswrapper[4749]: I0320 07:26:57.811792 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n7mld" event={"ID":"68f28f80-0a90-4e42-ac6e-66fff5eec59a","Type":"ContainerDied","Data":"039b4e224db86708c09ac56aa4ec3be73d78287f44567f49e1e12c0a91b294cb"} Mar 20 07:26:57 crc kubenswrapper[4749]: I0320 07:26:57.812050 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="039b4e224db86708c09ac56aa4ec3be73d78287f44567f49e1e12c0a91b294cb" Mar 20 07:26:57 crc kubenswrapper[4749]: I0320 07:26:57.812097 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-m896z" Mar 20 07:26:57 crc kubenswrapper[4749]: I0320 07:26:57.812115 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-m896z" Mar 20 07:26:57 crc kubenswrapper[4749]: I0320 07:26:57.811867 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n7mld" Mar 20 07:26:58 crc kubenswrapper[4749]: I0320 07:26:58.868642 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-m896z" podUID="2194420d-f882-4caf-bdd8-8942bdaadabf" containerName="registry-server" probeResult="failure" output=< Mar 20 07:26:58 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Mar 20 07:26:58 crc kubenswrapper[4749]: > Mar 20 07:26:59 crc kubenswrapper[4749]: I0320 07:26:59.707423 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-hdj6p"] Mar 20 07:26:59 crc kubenswrapper[4749]: E0320 07:26:59.707667 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68f28f80-0a90-4e42-ac6e-66fff5eec59a" containerName="extract" Mar 20 07:26:59 crc kubenswrapper[4749]: I0320 07:26:59.707686 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="68f28f80-0a90-4e42-ac6e-66fff5eec59a" containerName="extract" Mar 20 07:26:59 crc kubenswrapper[4749]: E0320 07:26:59.707714 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68f28f80-0a90-4e42-ac6e-66fff5eec59a" containerName="util" Mar 20 07:26:59 crc kubenswrapper[4749]: I0320 07:26:59.707722 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="68f28f80-0a90-4e42-ac6e-66fff5eec59a" containerName="util" Mar 20 07:26:59 crc kubenswrapper[4749]: E0320 07:26:59.707733 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68f28f80-0a90-4e42-ac6e-66fff5eec59a" containerName="pull" Mar 20 07:26:59 crc kubenswrapper[4749]: I0320 07:26:59.707742 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="68f28f80-0a90-4e42-ac6e-66fff5eec59a" containerName="pull" Mar 20 07:26:59 crc kubenswrapper[4749]: I0320 07:26:59.707870 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="68f28f80-0a90-4e42-ac6e-66fff5eec59a" containerName="extract" Mar 20 07:26:59 crc kubenswrapper[4749]: I0320 07:26:59.708367 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-hdj6p" Mar 20 07:26:59 crc kubenswrapper[4749]: I0320 07:26:59.710131 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-svgj4" Mar 20 07:26:59 crc kubenswrapper[4749]: I0320 07:26:59.710274 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 20 07:26:59 crc kubenswrapper[4749]: I0320 07:26:59.711071 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 20 07:26:59 crc kubenswrapper[4749]: I0320 07:26:59.717209 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-hdj6p"] Mar 20 07:26:59 crc kubenswrapper[4749]: I0320 07:26:59.748825 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktjhd\" (UniqueName: \"kubernetes.io/projected/31193cc4-cc66-42d3-9029-2fcd90d1d9bc-kube-api-access-ktjhd\") pod \"nmstate-operator-796d4cfff4-hdj6p\" (UID: \"31193cc4-cc66-42d3-9029-2fcd90d1d9bc\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-hdj6p" Mar 20 07:26:59 crc kubenswrapper[4749]: I0320 07:26:59.850306 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktjhd\" (UniqueName: \"kubernetes.io/projected/31193cc4-cc66-42d3-9029-2fcd90d1d9bc-kube-api-access-ktjhd\") pod \"nmstate-operator-796d4cfff4-hdj6p\" (UID: \"31193cc4-cc66-42d3-9029-2fcd90d1d9bc\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-hdj6p" Mar 20 07:26:59 crc kubenswrapper[4749]: I0320 07:26:59.875353 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktjhd\" (UniqueName: \"kubernetes.io/projected/31193cc4-cc66-42d3-9029-2fcd90d1d9bc-kube-api-access-ktjhd\") pod \"nmstate-operator-796d4cfff4-hdj6p\" (UID: \"31193cc4-cc66-42d3-9029-2fcd90d1d9bc\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-hdj6p" Mar 20 07:27:00 crc kubenswrapper[4749]: I0320 07:27:00.024222 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-hdj6p" Mar 20 07:27:00 crc kubenswrapper[4749]: I0320 07:27:00.287096 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-hdj6p"] Mar 20 07:27:00 crc kubenswrapper[4749]: W0320 07:27:00.291178 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31193cc4_cc66_42d3_9029_2fcd90d1d9bc.slice/crio-af795ad70fb25693f24a42506b003ca586a91380c1bfd48e0760771e55b05f7f WatchSource:0}: Error finding container af795ad70fb25693f24a42506b003ca586a91380c1bfd48e0760771e55b05f7f: Status 404 returned error can't find the container with id af795ad70fb25693f24a42506b003ca586a91380c1bfd48e0760771e55b05f7f Mar 20 07:27:00 crc kubenswrapper[4749]: I0320 07:27:00.841548 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-hdj6p" event={"ID":"31193cc4-cc66-42d3-9029-2fcd90d1d9bc","Type":"ContainerStarted","Data":"af795ad70fb25693f24a42506b003ca586a91380c1bfd48e0760771e55b05f7f"} Mar 20 07:27:03 crc kubenswrapper[4749]: I0320 07:27:03.870543 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-hdj6p" event={"ID":"31193cc4-cc66-42d3-9029-2fcd90d1d9bc","Type":"ContainerStarted","Data":"340b56936e7420e1ea482b7fbcb854ac483b0ec22e881aac6ebcb8d616a656a2"} Mar 20 07:27:03 crc kubenswrapper[4749]: I0320 07:27:03.911776 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-hdj6p" podStartSLOduration=2.448635316 podStartE2EDuration="4.911745772s" podCreationTimestamp="2026-03-20 07:26:59 +0000 UTC" firstStartedPulling="2026-03-20 07:27:00.295496022 +0000 UTC m=+856.845153669" lastFinishedPulling="2026-03-20 07:27:02.758606478 +0000 UTC m=+859.308264125" observedRunningTime="2026-03-20 07:27:03.899504636 +0000 UTC m=+860.449162363" watchObservedRunningTime="2026-03-20 07:27:03.911745772 +0000 UTC m=+860.461403459" Mar 20 07:27:07 crc kubenswrapper[4749]: I0320 07:27:07.872672 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-m896z" Mar 20 07:27:07 crc kubenswrapper[4749]: I0320 07:27:07.919029 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-m896z" Mar 20 07:27:08 crc kubenswrapper[4749]: I0320 07:27:08.198155 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m896z"] Mar 20 07:27:08 crc kubenswrapper[4749]: I0320 07:27:08.910399 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-m896z" podUID="2194420d-f882-4caf-bdd8-8942bdaadabf" containerName="registry-server" containerID="cri-o://922728821a5caa77948eeca479cbaa194eed3f012673a50eba2218b4fb29ba37" gracePeriod=2 Mar 20 07:27:09 crc kubenswrapper[4749]: I0320 07:27:09.347826 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m896z" Mar 20 07:27:09 crc kubenswrapper[4749]: I0320 07:27:09.491041 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nw8v\" (UniqueName: \"kubernetes.io/projected/2194420d-f882-4caf-bdd8-8942bdaadabf-kube-api-access-6nw8v\") pod \"2194420d-f882-4caf-bdd8-8942bdaadabf\" (UID: \"2194420d-f882-4caf-bdd8-8942bdaadabf\") " Mar 20 07:27:09 crc kubenswrapper[4749]: I0320 07:27:09.493391 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2194420d-f882-4caf-bdd8-8942bdaadabf-utilities\") pod \"2194420d-f882-4caf-bdd8-8942bdaadabf\" (UID: \"2194420d-f882-4caf-bdd8-8942bdaadabf\") " Mar 20 07:27:09 crc kubenswrapper[4749]: I0320 07:27:09.493588 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2194420d-f882-4caf-bdd8-8942bdaadabf-catalog-content\") pod \"2194420d-f882-4caf-bdd8-8942bdaadabf\" (UID: \"2194420d-f882-4caf-bdd8-8942bdaadabf\") " Mar 20 07:27:09 crc kubenswrapper[4749]: I0320 07:27:09.497719 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2194420d-f882-4caf-bdd8-8942bdaadabf-utilities" (OuterVolumeSpecName: "utilities") pod "2194420d-f882-4caf-bdd8-8942bdaadabf" (UID: "2194420d-f882-4caf-bdd8-8942bdaadabf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:27:09 crc kubenswrapper[4749]: I0320 07:27:09.511792 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2194420d-f882-4caf-bdd8-8942bdaadabf-kube-api-access-6nw8v" (OuterVolumeSpecName: "kube-api-access-6nw8v") pod "2194420d-f882-4caf-bdd8-8942bdaadabf" (UID: "2194420d-f882-4caf-bdd8-8942bdaadabf"). InnerVolumeSpecName "kube-api-access-6nw8v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:27:09 crc kubenswrapper[4749]: I0320 07:27:09.596418 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nw8v\" (UniqueName: \"kubernetes.io/projected/2194420d-f882-4caf-bdd8-8942bdaadabf-kube-api-access-6nw8v\") on node \"crc\" DevicePath \"\"" Mar 20 07:27:09 crc kubenswrapper[4749]: I0320 07:27:09.596770 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2194420d-f882-4caf-bdd8-8942bdaadabf-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 07:27:09 crc kubenswrapper[4749]: I0320 07:27:09.678529 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2194420d-f882-4caf-bdd8-8942bdaadabf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2194420d-f882-4caf-bdd8-8942bdaadabf" (UID: "2194420d-f882-4caf-bdd8-8942bdaadabf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:27:09 crc kubenswrapper[4749]: I0320 07:27:09.697914 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2194420d-f882-4caf-bdd8-8942bdaadabf-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 07:27:09 crc kubenswrapper[4749]: I0320 07:27:09.921432 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m896z" Mar 20 07:27:09 crc kubenswrapper[4749]: I0320 07:27:09.921238 4749 generic.go:334] "Generic (PLEG): container finished" podID="2194420d-f882-4caf-bdd8-8942bdaadabf" containerID="922728821a5caa77948eeca479cbaa194eed3f012673a50eba2218b4fb29ba37" exitCode=0 Mar 20 07:27:09 crc kubenswrapper[4749]: I0320 07:27:09.921477 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m896z" event={"ID":"2194420d-f882-4caf-bdd8-8942bdaadabf","Type":"ContainerDied","Data":"922728821a5caa77948eeca479cbaa194eed3f012673a50eba2218b4fb29ba37"} Mar 20 07:27:09 crc kubenswrapper[4749]: I0320 07:27:09.928470 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m896z" event={"ID":"2194420d-f882-4caf-bdd8-8942bdaadabf","Type":"ContainerDied","Data":"ea0e95e16d958d60ca6f067e22692708a1609f2d9069208163b81cd940f9dc7b"} Mar 20 07:27:09 crc kubenswrapper[4749]: I0320 07:27:09.928511 4749 scope.go:117] "RemoveContainer" containerID="922728821a5caa77948eeca479cbaa194eed3f012673a50eba2218b4fb29ba37" Mar 20 07:27:09 crc kubenswrapper[4749]: I0320 07:27:09.957090 4749 scope.go:117] "RemoveContainer" containerID="cd73b990aaca9c5ffe1a74b128e626f81e1c43e00b29a52136c777433ef05d00" Mar 20 07:27:09 crc kubenswrapper[4749]: I0320 07:27:09.959624 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m896z"] Mar 20 07:27:09 crc kubenswrapper[4749]: I0320 07:27:09.964731 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-m896z"] Mar 20 07:27:09 crc kubenswrapper[4749]: I0320 07:27:09.990300 4749 scope.go:117] "RemoveContainer" containerID="428ca903ca2c7775c7b25490756bf0ed2ba820c35461cb3886b8e9c34c599a0e" Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.002665 4749 scope.go:117] "RemoveContainer" containerID="922728821a5caa77948eeca479cbaa194eed3f012673a50eba2218b4fb29ba37" Mar 20 07:27:10 crc kubenswrapper[4749]: E0320 07:27:10.003071 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"922728821a5caa77948eeca479cbaa194eed3f012673a50eba2218b4fb29ba37\": container with ID starting with 922728821a5caa77948eeca479cbaa194eed3f012673a50eba2218b4fb29ba37 not found: ID does not exist" containerID="922728821a5caa77948eeca479cbaa194eed3f012673a50eba2218b4fb29ba37" Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.003126 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"922728821a5caa77948eeca479cbaa194eed3f012673a50eba2218b4fb29ba37"} err="failed to get container status \"922728821a5caa77948eeca479cbaa194eed3f012673a50eba2218b4fb29ba37\": rpc error: code = NotFound desc = could not find container \"922728821a5caa77948eeca479cbaa194eed3f012673a50eba2218b4fb29ba37\": container with ID starting with 922728821a5caa77948eeca479cbaa194eed3f012673a50eba2218b4fb29ba37 not found: ID does not exist" Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.003153 4749 scope.go:117] "RemoveContainer" containerID="cd73b990aaca9c5ffe1a74b128e626f81e1c43e00b29a52136c777433ef05d00" Mar 20 07:27:10 crc kubenswrapper[4749]: E0320 07:27:10.003454 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd73b990aaca9c5ffe1a74b128e626f81e1c43e00b29a52136c777433ef05d00\": container with ID starting with 
cd73b990aaca9c5ffe1a74b128e626f81e1c43e00b29a52136c777433ef05d00 not found: ID does not exist" containerID="cd73b990aaca9c5ffe1a74b128e626f81e1c43e00b29a52136c777433ef05d00" Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.003487 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd73b990aaca9c5ffe1a74b128e626f81e1c43e00b29a52136c777433ef05d00"} err="failed to get container status \"cd73b990aaca9c5ffe1a74b128e626f81e1c43e00b29a52136c777433ef05d00\": rpc error: code = NotFound desc = could not find container \"cd73b990aaca9c5ffe1a74b128e626f81e1c43e00b29a52136c777433ef05d00\": container with ID starting with cd73b990aaca9c5ffe1a74b128e626f81e1c43e00b29a52136c777433ef05d00 not found: ID does not exist" Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.003508 4749 scope.go:117] "RemoveContainer" containerID="428ca903ca2c7775c7b25490756bf0ed2ba820c35461cb3886b8e9c34c599a0e" Mar 20 07:27:10 crc kubenswrapper[4749]: E0320 07:27:10.003778 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"428ca903ca2c7775c7b25490756bf0ed2ba820c35461cb3886b8e9c34c599a0e\": container with ID starting with 428ca903ca2c7775c7b25490756bf0ed2ba820c35461cb3886b8e9c34c599a0e not found: ID does not exist" containerID="428ca903ca2c7775c7b25490756bf0ed2ba820c35461cb3886b8e9c34c599a0e" Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.003800 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"428ca903ca2c7775c7b25490756bf0ed2ba820c35461cb3886b8e9c34c599a0e"} err="failed to get container status \"428ca903ca2c7775c7b25490756bf0ed2ba820c35461cb3886b8e9c34c599a0e\": rpc error: code = NotFound desc = could not find container \"428ca903ca2c7775c7b25490756bf0ed2ba820c35461cb3886b8e9c34c599a0e\": container with ID starting with 428ca903ca2c7775c7b25490756bf0ed2ba820c35461cb3886b8e9c34c599a0e not found: ID does not exist" Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.187090 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2194420d-f882-4caf-bdd8-8942bdaadabf" path="/var/lib/kubelet/pods/2194420d-f882-4caf-bdd8-8942bdaadabf/volumes" Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.364962 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-ggtqr"] Mar 20 07:27:10 crc kubenswrapper[4749]: E0320 07:27:10.365368 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2194420d-f882-4caf-bdd8-8942bdaadabf" containerName="extract-utilities" Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.365400 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2194420d-f882-4caf-bdd8-8942bdaadabf" containerName="extract-utilities" Mar 20 07:27:10 crc kubenswrapper[4749]: E0320 07:27:10.365418 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2194420d-f882-4caf-bdd8-8942bdaadabf" containerName="registry-server" Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.365430 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2194420d-f882-4caf-bdd8-8942bdaadabf" containerName="registry-server" Mar 20 07:27:10 crc kubenswrapper[4749]: E0320 07:27:10.365448 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2194420d-f882-4caf-bdd8-8942bdaadabf" containerName="extract-content"
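
The cpu_manager / memory_manager / state_mem entries above and below are admission-time housekeeping: before admitting a new pod, the resource managers drop CPU-set and memory assignments still recorded for containers of pods that no longer exist, here the just-deleted redhat-operators-m896z catalog pod. A simplified sketch of that sweep over hypothetical state keyed the same way, pod UID then container name:

```go
package main

import "fmt"

// cpuAssignments mimics the CPU manager's checkpointed state: per-pod,
// per-container CPU-set assignments (hypothetical, simplified to strings).
type cpuAssignments map[string]map[string]string

// removeStaleState drops assignments for containers whose pod is no longer
// active, which is what the RemoveStaleState lines in the log record.
func (a cpuAssignments) removeStaleState(activePods map[string]bool) {
	for podUID, containers := range a {
		if activePods[podUID] {
			continue
		}
		for name := range containers {
			fmt.Printf("RemoveStaleState: removing container %q of pod %s\n", name, podUID)
		}
		delete(a, podUID)
	}
}

func main() {
	state := cpuAssignments{
		"2194420d": {"extract-utilities": "0-1", "registry-server": "2", "extract-content": "3"},
	}
	state.removeStaleState(map[string]bool{}) // the pod was already deleted
}
```
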
podUID="2194420d-f882-4caf-bdd8-8942bdaadabf" containerName="extract-content" Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.365689 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2194420d-f882-4caf-bdd8-8942bdaadabf" containerName="registry-server" Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.366682 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-ggtqr" Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.368929 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-fld9f" Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.374678 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-qzwd2"] Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.375798 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-qzwd2" Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.378786 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.385227 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-ggtqr"] Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.405367 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-qzwd2"] Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.411243 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-cmf78"] Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.412110 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-cmf78" Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.496922 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-nzfz2"] Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.497754 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-nzfz2" Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.500307 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.500912 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.501095 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-gz8jm" Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.507706 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/74b419be-dfe1-4b8c-a6b1-79b50e32b335-ovs-socket\") pod \"nmstate-handler-cmf78\" (UID: \"74b419be-dfe1-4b8c-a6b1-79b50e32b335\") " pod="openshift-nmstate/nmstate-handler-cmf78" Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.507824 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbdfg\" (UniqueName: \"kubernetes.io/projected/80061915-6756-40d7-9f66-71248a0255dd-kube-api-access-xbdfg\") pod \"nmstate-metrics-9b8c8685d-ggtqr\" (UID: \"80061915-6756-40d7-9f66-71248a0255dd\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-ggtqr" Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.507913 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/45ebdb11-6585-4874-986c-7a5f0e456e26-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-qzwd2\" (UID: \"45ebdb11-6585-4874-986c-7a5f0e456e26\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-qzwd2" Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.507932 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/74b419be-dfe1-4b8c-a6b1-79b50e32b335-nmstate-lock\") pod \"nmstate-handler-cmf78\" (UID: \"74b419be-dfe1-4b8c-a6b1-79b50e32b335\") " pod="openshift-nmstate/nmstate-handler-cmf78" Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.507976 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggxm2\" (UniqueName: \"kubernetes.io/projected/45ebdb11-6585-4874-986c-7a5f0e456e26-kube-api-access-ggxm2\") pod \"nmstate-webhook-5f558f5558-qzwd2\" (UID: \"45ebdb11-6585-4874-986c-7a5f0e456e26\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-qzwd2" Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.507993 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/74b419be-dfe1-4b8c-a6b1-79b50e32b335-dbus-socket\") pod \"nmstate-handler-cmf78\" (UID: \"74b419be-dfe1-4b8c-a6b1-79b50e32b335\") " pod="openshift-nmstate/nmstate-handler-cmf78" Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.508546 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nffwh\" (UniqueName: \"kubernetes.io/projected/74b419be-dfe1-4b8c-a6b1-79b50e32b335-kube-api-access-nffwh\") pod \"nmstate-handler-cmf78\" (UID: \"74b419be-dfe1-4b8c-a6b1-79b50e32b335\") " pod="openshift-nmstate/nmstate-handler-cmf78" Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 
Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.609150 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk8wd\" (UniqueName: \"kubernetes.io/projected/e8b78869-90fe-4c18-9cc0-6605ad9ecdbc-kube-api-access-gk8wd\") pod \"nmstate-console-plugin-86f58fcf4-nzfz2\" (UID: \"e8b78869-90fe-4c18-9cc0-6605ad9ecdbc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-nzfz2"
Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.609201 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/74b419be-dfe1-4b8c-a6b1-79b50e32b335-nmstate-lock\") pod \"nmstate-handler-cmf78\" (UID: \"74b419be-dfe1-4b8c-a6b1-79b50e32b335\") " pod="openshift-nmstate/nmstate-handler-cmf78"
Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.609219 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/45ebdb11-6585-4874-986c-7a5f0e456e26-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-qzwd2\" (UID: \"45ebdb11-6585-4874-986c-7a5f0e456e26\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-qzwd2"
Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.609258 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggxm2\" (UniqueName: \"kubernetes.io/projected/45ebdb11-6585-4874-986c-7a5f0e456e26-kube-api-access-ggxm2\") pod \"nmstate-webhook-5f558f5558-qzwd2\" (UID: \"45ebdb11-6585-4874-986c-7a5f0e456e26\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-qzwd2"
Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.609290 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/74b419be-dfe1-4b8c-a6b1-79b50e32b335-dbus-socket\") pod \"nmstate-handler-cmf78\" (UID: \"74b419be-dfe1-4b8c-a6b1-79b50e32b335\") " pod="openshift-nmstate/nmstate-handler-cmf78"
Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.609308 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e8b78869-90fe-4c18-9cc0-6605ad9ecdbc-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-nzfz2\" (UID: \"e8b78869-90fe-4c18-9cc0-6605ad9ecdbc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-nzfz2"
Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.609321 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/74b419be-dfe1-4b8c-a6b1-79b50e32b335-nmstate-lock\") pod \"nmstate-handler-cmf78\" (UID: \"74b419be-dfe1-4b8c-a6b1-79b50e32b335\") " pod="openshift-nmstate/nmstate-handler-cmf78"
Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.609333 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nffwh\" (UniqueName: \"kubernetes.io/projected/74b419be-dfe1-4b8c-a6b1-79b50e32b335-kube-api-access-nffwh\") pod \"nmstate-handler-cmf78\" (UID: \"74b419be-dfe1-4b8c-a6b1-79b50e32b335\") " pod="openshift-nmstate/nmstate-handler-cmf78"
Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.609434 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/74b419be-dfe1-4b8c-a6b1-79b50e32b335-ovs-socket\") pod \"nmstate-handler-cmf78\" (UID: \"74b419be-dfe1-4b8c-a6b1-79b50e32b335\") " pod="openshift-nmstate/nmstate-handler-cmf78"
Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.609463 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbdfg\" (UniqueName: \"kubernetes.io/projected/80061915-6756-40d7-9f66-71248a0255dd-kube-api-access-xbdfg\") pod \"nmstate-metrics-9b8c8685d-ggtqr\" (UID: \"80061915-6756-40d7-9f66-71248a0255dd\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-ggtqr"
Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.609490 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e8b78869-90fe-4c18-9cc0-6605ad9ecdbc-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-nzfz2\" (UID: \"e8b78869-90fe-4c18-9cc0-6605ad9ecdbc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-nzfz2"
Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.609530 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/74b419be-dfe1-4b8c-a6b1-79b50e32b335-ovs-socket\") pod \"nmstate-handler-cmf78\" (UID: \"74b419be-dfe1-4b8c-a6b1-79b50e32b335\") " pod="openshift-nmstate/nmstate-handler-cmf78"
Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.609621 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/74b419be-dfe1-4b8c-a6b1-79b50e32b335-dbus-socket\") pod \"nmstate-handler-cmf78\" (UID: \"74b419be-dfe1-4b8c-a6b1-79b50e32b335\") " pod="openshift-nmstate/nmstate-handler-cmf78"
Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.613453 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/45ebdb11-6585-4874-986c-7a5f0e456e26-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-qzwd2\" (UID: \"45ebdb11-6585-4874-986c-7a5f0e456e26\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-qzwd2"
Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.632959 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbdfg\" (UniqueName: \"kubernetes.io/projected/80061915-6756-40d7-9f66-71248a0255dd-kube-api-access-xbdfg\") pod \"nmstate-metrics-9b8c8685d-ggtqr\" (UID: \"80061915-6756-40d7-9f66-71248a0255dd\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-ggtqr"
Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.633172 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggxm2\" (UniqueName: \"kubernetes.io/projected/45ebdb11-6585-4874-986c-7a5f0e456e26-kube-api-access-ggxm2\") pod \"nmstate-webhook-5f558f5558-qzwd2\" (UID: \"45ebdb11-6585-4874-986c-7a5f0e456e26\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-qzwd2"
Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.633413 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nffwh\" (UniqueName: \"kubernetes.io/projected/74b419be-dfe1-4b8c-a6b1-79b50e32b335-kube-api-access-nffwh\") pod \"nmstate-handler-cmf78\" (UID: \"74b419be-dfe1-4b8c-a6b1-79b50e32b335\") " pod="openshift-nmstate/nmstate-handler-cmf78"
Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.681968 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-59876b96ff-dxx25"]
pods=["openshift-console/console-59876b96ff-dxx25"] Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.683006 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-59876b96ff-dxx25" Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.691421 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-ggtqr" Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.699429 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-qzwd2" Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.710383 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e8b78869-90fe-4c18-9cc0-6605ad9ecdbc-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-nzfz2\" (UID: \"e8b78869-90fe-4c18-9cc0-6605ad9ecdbc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-nzfz2" Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.710441 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gk8wd\" (UniqueName: \"kubernetes.io/projected/e8b78869-90fe-4c18-9cc0-6605ad9ecdbc-kube-api-access-gk8wd\") pod \"nmstate-console-plugin-86f58fcf4-nzfz2\" (UID: \"e8b78869-90fe-4c18-9cc0-6605ad9ecdbc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-nzfz2" Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.710472 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e8b78869-90fe-4c18-9cc0-6605ad9ecdbc-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-nzfz2\" (UID: \"e8b78869-90fe-4c18-9cc0-6605ad9ecdbc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-nzfz2" Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.711134 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e8b78869-90fe-4c18-9cc0-6605ad9ecdbc-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-nzfz2\" (UID: \"e8b78869-90fe-4c18-9cc0-6605ad9ecdbc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-nzfz2" Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.712596 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-59876b96ff-dxx25"] Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.716367 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e8b78869-90fe-4c18-9cc0-6605ad9ecdbc-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-nzfz2\" (UID: \"e8b78869-90fe-4c18-9cc0-6605ad9ecdbc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-nzfz2" Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.731556 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-cmf78" Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.759973 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk8wd\" (UniqueName: \"kubernetes.io/projected/e8b78869-90fe-4c18-9cc0-6605ad9ecdbc-kube-api-access-gk8wd\") pod \"nmstate-console-plugin-86f58fcf4-nzfz2\" (UID: \"e8b78869-90fe-4c18-9cc0-6605ad9ecdbc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-nzfz2" Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.808777 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.811139 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1c043e12-1d08-458b-bc62-83faceeae7d7-service-ca\") pod \"console-59876b96ff-dxx25\" (UID: \"1c043e12-1d08-458b-bc62-83faceeae7d7\") " pod="openshift-console/console-59876b96ff-dxx25" Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.811171 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1c043e12-1d08-458b-bc62-83faceeae7d7-console-serving-cert\") pod \"console-59876b96ff-dxx25\" (UID: \"1c043e12-1d08-458b-bc62-83faceeae7d7\") " pod="openshift-console/console-59876b96ff-dxx25" Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.811201 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1c043e12-1d08-458b-bc62-83faceeae7d7-console-oauth-config\") pod \"console-59876b96ff-dxx25\" (UID: \"1c043e12-1d08-458b-bc62-83faceeae7d7\") " pod="openshift-console/console-59876b96ff-dxx25" Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.811222 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxx9c\" (UniqueName: \"kubernetes.io/projected/1c043e12-1d08-458b-bc62-83faceeae7d7-kube-api-access-nxx9c\") pod \"console-59876b96ff-dxx25\" (UID: \"1c043e12-1d08-458b-bc62-83faceeae7d7\") " pod="openshift-console/console-59876b96ff-dxx25" Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.811237 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c043e12-1d08-458b-bc62-83faceeae7d7-trusted-ca-bundle\") pod \"console-59876b96ff-dxx25\" (UID: \"1c043e12-1d08-458b-bc62-83faceeae7d7\") " pod="openshift-console/console-59876b96ff-dxx25" Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.811256 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1c043e12-1d08-458b-bc62-83faceeae7d7-console-config\") pod \"console-59876b96ff-dxx25\" (UID: \"1c043e12-1d08-458b-bc62-83faceeae7d7\") " pod="openshift-console/console-59876b96ff-dxx25" Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.811322 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1c043e12-1d08-458b-bc62-83faceeae7d7-oauth-serving-cert\") pod \"console-59876b96ff-dxx25\" (UID: \"1c043e12-1d08-458b-bc62-83faceeae7d7\") " pod="openshift-console/console-59876b96ff-dxx25" 
Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.814140 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-nzfz2"
Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.912831 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1c043e12-1d08-458b-bc62-83faceeae7d7-oauth-serving-cert\") pod \"console-59876b96ff-dxx25\" (UID: \"1c043e12-1d08-458b-bc62-83faceeae7d7\") " pod="openshift-console/console-59876b96ff-dxx25"
Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.912896 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1c043e12-1d08-458b-bc62-83faceeae7d7-service-ca\") pod \"console-59876b96ff-dxx25\" (UID: \"1c043e12-1d08-458b-bc62-83faceeae7d7\") " pod="openshift-console/console-59876b96ff-dxx25"
Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.912915 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1c043e12-1d08-458b-bc62-83faceeae7d7-console-serving-cert\") pod \"console-59876b96ff-dxx25\" (UID: \"1c043e12-1d08-458b-bc62-83faceeae7d7\") " pod="openshift-console/console-59876b96ff-dxx25"
Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.912964 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1c043e12-1d08-458b-bc62-83faceeae7d7-console-oauth-config\") pod \"console-59876b96ff-dxx25\" (UID: \"1c043e12-1d08-458b-bc62-83faceeae7d7\") " pod="openshift-console/console-59876b96ff-dxx25"
Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.912984 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxx9c\" (UniqueName: \"kubernetes.io/projected/1c043e12-1d08-458b-bc62-83faceeae7d7-kube-api-access-nxx9c\") pod \"console-59876b96ff-dxx25\" (UID: \"1c043e12-1d08-458b-bc62-83faceeae7d7\") " pod="openshift-console/console-59876b96ff-dxx25"
Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.913117 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c043e12-1d08-458b-bc62-83faceeae7d7-trusted-ca-bundle\") pod \"console-59876b96ff-dxx25\" (UID: \"1c043e12-1d08-458b-bc62-83faceeae7d7\") " pod="openshift-console/console-59876b96ff-dxx25"
Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.913987 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1c043e12-1d08-458b-bc62-83faceeae7d7-oauth-serving-cert\") pod \"console-59876b96ff-dxx25\" (UID: \"1c043e12-1d08-458b-bc62-83faceeae7d7\") " pod="openshift-console/console-59876b96ff-dxx25"
Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.914050 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1c043e12-1d08-458b-bc62-83faceeae7d7-service-ca\") pod \"console-59876b96ff-dxx25\" (UID: \"1c043e12-1d08-458b-bc62-83faceeae7d7\") " pod="openshift-console/console-59876b96ff-dxx25"
Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.914273 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1c043e12-1d08-458b-bc62-83faceeae7d7-console-config\") pod \"console-59876b96ff-dxx25\" (UID: \"1c043e12-1d08-458b-bc62-83faceeae7d7\") " pod="openshift-console/console-59876b96ff-dxx25"
Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.914555 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c043e12-1d08-458b-bc62-83faceeae7d7-trusted-ca-bundle\") pod \"console-59876b96ff-dxx25\" (UID: \"1c043e12-1d08-458b-bc62-83faceeae7d7\") " pod="openshift-console/console-59876b96ff-dxx25"
Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.915077 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1c043e12-1d08-458b-bc62-83faceeae7d7-console-config\") pod \"console-59876b96ff-dxx25\" (UID: \"1c043e12-1d08-458b-bc62-83faceeae7d7\") " pod="openshift-console/console-59876b96ff-dxx25"
Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.917890 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1c043e12-1d08-458b-bc62-83faceeae7d7-console-oauth-config\") pod \"console-59876b96ff-dxx25\" (UID: \"1c043e12-1d08-458b-bc62-83faceeae7d7\") " pod="openshift-console/console-59876b96ff-dxx25"
Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.919586 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1c043e12-1d08-458b-bc62-83faceeae7d7-console-serving-cert\") pod \"console-59876b96ff-dxx25\" (UID: \"1c043e12-1d08-458b-bc62-83faceeae7d7\") " pod="openshift-console/console-59876b96ff-dxx25"
Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.931510 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxx9c\" (UniqueName: \"kubernetes.io/projected/1c043e12-1d08-458b-bc62-83faceeae7d7-kube-api-access-nxx9c\") pod \"console-59876b96ff-dxx25\" (UID: \"1c043e12-1d08-458b-bc62-83faceeae7d7\") " pod="openshift-console/console-59876b96ff-dxx25"
Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.940815 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-cmf78" event={"ID":"74b419be-dfe1-4b8c-a6b1-79b50e32b335","Type":"ContainerStarted","Data":"c01ba38b8681a28249e527cda2238035873da6f9ddbde3d652f78d10510e6436"}
Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.941473 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-ggtqr"]
Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.957962 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-qzwd2"]
Mar 20 07:27:10 crc kubenswrapper[4749]: W0320 07:27:10.964487 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45ebdb11_6585_4874_986c_7a5f0e456e26.slice/crio-1296cbab076636afca6901906bb9539a1bc3933e667bd9c8d855e45b88f8a16d WatchSource:0}: Error finding container 1296cbab076636afca6901906bb9539a1bc3933e667bd9c8d855e45b88f8a16d: Status 404 returned error can't find the container with id 1296cbab076636afca6901906bb9539a1bc3933e667bd9c8d855e45b88f8a16d
Mar 20 07:27:10 crc kubenswrapper[4749]: I0320 07:27:10.999163 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-59876b96ff-dxx25"
Need to start a new one" pod="openshift-console/console-59876b96ff-dxx25" Mar 20 07:27:11 crc kubenswrapper[4749]: I0320 07:27:11.020036 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-nzfz2"] Mar 20 07:27:11 crc kubenswrapper[4749]: I0320 07:27:11.184990 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-59876b96ff-dxx25"] Mar 20 07:27:11 crc kubenswrapper[4749]: W0320 07:27:11.185838 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c043e12_1d08_458b_bc62_83faceeae7d7.slice/crio-6319a4df48e349c9884696ae4889c302947734ab12e5b67dd5909df4c9d00fa8 WatchSource:0}: Error finding container 6319a4df48e349c9884696ae4889c302947734ab12e5b67dd5909df4c9d00fa8: Status 404 returned error can't find the container with id 6319a4df48e349c9884696ae4889c302947734ab12e5b67dd5909df4c9d00fa8 Mar 20 07:27:11 crc kubenswrapper[4749]: I0320 07:27:11.968418 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-ggtqr" event={"ID":"80061915-6756-40d7-9f66-71248a0255dd","Type":"ContainerStarted","Data":"995419d32c0c8bb962271fafb3160adbc51dfc5677eff0e8d5861354d0be8f79"} Mar 20 07:27:11 crc kubenswrapper[4749]: I0320 07:27:11.970749 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59876b96ff-dxx25" event={"ID":"1c043e12-1d08-458b-bc62-83faceeae7d7","Type":"ContainerStarted","Data":"20eb73746908c3504f3a2d076bbe067f9fd479fd3a0ec93ae0eecbab371c662f"} Mar 20 07:27:11 crc kubenswrapper[4749]: I0320 07:27:11.970794 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59876b96ff-dxx25" event={"ID":"1c043e12-1d08-458b-bc62-83faceeae7d7","Type":"ContainerStarted","Data":"6319a4df48e349c9884696ae4889c302947734ab12e5b67dd5909df4c9d00fa8"} Mar 20 07:27:11 crc kubenswrapper[4749]: I0320 07:27:11.972675 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-qzwd2" event={"ID":"45ebdb11-6585-4874-986c-7a5f0e456e26","Type":"ContainerStarted","Data":"1296cbab076636afca6901906bb9539a1bc3933e667bd9c8d855e45b88f8a16d"} Mar 20 07:27:11 crc kubenswrapper[4749]: I0320 07:27:11.974078 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-nzfz2" event={"ID":"e8b78869-90fe-4c18-9cc0-6605ad9ecdbc","Type":"ContainerStarted","Data":"a1ac7acb56f0ca34b4a43059a77123ac8b64101c6fb526f4c4be86c5896f9fb8"} Mar 20 07:27:11 crc kubenswrapper[4749]: I0320 07:27:11.994694 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-59876b96ff-dxx25" podStartSLOduration=1.994660969 podStartE2EDuration="1.994660969s" podCreationTimestamp="2026-03-20 07:27:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:27:11.991871932 +0000 UTC m=+868.541529609" watchObservedRunningTime="2026-03-20 07:27:11.994660969 +0000 UTC m=+868.544318656" Mar 20 07:27:14 crc kubenswrapper[4749]: I0320 07:27:14.994317 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-cmf78" event={"ID":"74b419be-dfe1-4b8c-a6b1-79b50e32b335","Type":"ContainerStarted","Data":"c32a13b3382ad3064104db8e39e4ae79080b6329c0b174bf45e8888c0db4aa72"} Mar 20 07:27:14 crc kubenswrapper[4749]: I0320 07:27:14.994927 4749 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-cmf78" Mar 20 07:27:14 crc kubenswrapper[4749]: I0320 07:27:14.997432 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-nzfz2" event={"ID":"e8b78869-90fe-4c18-9cc0-6605ad9ecdbc","Type":"ContainerStarted","Data":"07ebe07c401c65737f90aa2b2d9fab36f7b7f9ba5c479460e563b329f4cdbb31"} Mar 20 07:27:14 crc kubenswrapper[4749]: I0320 07:27:14.999807 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-ggtqr" event={"ID":"80061915-6756-40d7-9f66-71248a0255dd","Type":"ContainerStarted","Data":"0a6bf2013705936042304f7b21c5c31821bd1e9c651fb344c825453d313a5992"} Mar 20 07:27:15 crc kubenswrapper[4749]: I0320 07:27:15.001633 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-qzwd2" event={"ID":"45ebdb11-6585-4874-986c-7a5f0e456e26","Type":"ContainerStarted","Data":"204e7cb236d9ef4daf4d735e7f535c9bee10755fa7ee608ff5ad61b037b8ea1e"} Mar 20 07:27:15 crc kubenswrapper[4749]: I0320 07:27:15.002432 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-qzwd2" Mar 20 07:27:15 crc kubenswrapper[4749]: I0320 07:27:15.019087 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-cmf78" podStartSLOduration=1.9788196569999998 podStartE2EDuration="5.019056422s" podCreationTimestamp="2026-03-20 07:27:10 +0000 UTC" firstStartedPulling="2026-03-20 07:27:10.80855048 +0000 UTC m=+867.358208127" lastFinishedPulling="2026-03-20 07:27:13.848787215 +0000 UTC m=+870.398444892" observedRunningTime="2026-03-20 07:27:15.016844798 +0000 UTC m=+871.566502475" watchObservedRunningTime="2026-03-20 07:27:15.019056422 +0000 UTC m=+871.568714079" Mar 20 07:27:15 crc kubenswrapper[4749]: I0320 07:27:15.040405 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-nzfz2" podStartSLOduration=2.223760843 podStartE2EDuration="5.040383947s" podCreationTimestamp="2026-03-20 07:27:10 +0000 UTC" firstStartedPulling="2026-03-20 07:27:11.024932246 +0000 UTC m=+867.574589903" lastFinishedPulling="2026-03-20 07:27:13.84155536 +0000 UTC m=+870.391213007" observedRunningTime="2026-03-20 07:27:15.038516922 +0000 UTC m=+871.588174599" watchObservedRunningTime="2026-03-20 07:27:15.040383947 +0000 UTC m=+871.590041594" Mar 20 07:27:15 crc kubenswrapper[4749]: I0320 07:27:15.070840 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-qzwd2" podStartSLOduration=2.18881967 podStartE2EDuration="5.070822263s" podCreationTimestamp="2026-03-20 07:27:10 +0000 UTC" firstStartedPulling="2026-03-20 07:27:10.966801792 +0000 UTC m=+867.516459439" lastFinishedPulling="2026-03-20 07:27:13.848804345 +0000 UTC m=+870.398462032" observedRunningTime="2026-03-20 07:27:15.064431598 +0000 UTC m=+871.614089245" watchObservedRunningTime="2026-03-20 07:27:15.070822263 +0000 UTC m=+871.620479910" Mar 20 07:27:17 crc kubenswrapper[4749]: I0320 07:27:17.015218 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-ggtqr" event={"ID":"80061915-6756-40d7-9f66-71248a0255dd","Type":"ContainerStarted","Data":"39b30e783a8dc818218ab69cd684615776a9af40428b8a54975f1796c1339c45"} Mar 20 07:27:17 crc kubenswrapper[4749]: 
Mar 20 07:27:20 crc kubenswrapper[4749]: I0320 07:27:20.760074 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-cmf78"
Mar 20 07:27:20 crc kubenswrapper[4749]: I0320 07:27:20.999693 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-59876b96ff-dxx25"
Mar 20 07:27:20 crc kubenswrapper[4749]: I0320 07:27:20.999897 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-59876b96ff-dxx25"
Mar 20 07:27:21 crc kubenswrapper[4749]: I0320 07:27:21.008017 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-59876b96ff-dxx25"
Mar 20 07:27:21 crc kubenswrapper[4749]: I0320 07:27:21.044889 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-59876b96ff-dxx25"
Mar 20 07:27:21 crc kubenswrapper[4749]: I0320 07:27:21.093318 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-2zlqs"]
Mar 20 07:27:30 crc kubenswrapper[4749]: I0320 07:27:30.709975 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-qzwd2"
Mar 20 07:27:46 crc kubenswrapper[4749]: I0320 07:27:46.140498 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-2zlqs" podUID="ff4ae6b4-eebc-4a32-b390-ec7ea70c8841" containerName="console" containerID="cri-o://bd50e4fa543b07ebf9d0f5e3db4b3c9481c8d248ce432c831dcae0628425657d" gracePeriod=15
Mar 20 07:27:46 crc kubenswrapper[4749]: I0320 07:27:46.537034 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-2zlqs_ff4ae6b4-eebc-4a32-b390-ec7ea70c8841/console/0.log"
Mar 20 07:27:46 crc kubenswrapper[4749]: I0320 07:27:46.537214 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-2zlqs"
Mar 20 07:27:46 crc kubenswrapper[4749]: I0320 07:27:46.554823 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16xgc9"]
Mar 20 07:27:46 crc kubenswrapper[4749]: E0320 07:27:46.555052 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff4ae6b4-eebc-4a32-b390-ec7ea70c8841" containerName="console"
Mar 20 07:27:46 crc kubenswrapper[4749]: I0320 07:27:46.555069 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff4ae6b4-eebc-4a32-b390-ec7ea70c8841" containerName="console"
Mar 20 07:27:46 crc kubenswrapper[4749]: I0320 07:27:46.555194 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff4ae6b4-eebc-4a32-b390-ec7ea70c8841" containerName="console"
Mar 20 07:27:46 crc kubenswrapper[4749]: I0320 07:27:46.555937 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16xgc9"
Mar 20 07:27:46 crc kubenswrapper[4749]: I0320 07:27:46.559364 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 20 07:27:46 crc kubenswrapper[4749]: I0320 07:27:46.573753 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16xgc9"]
Mar 20 07:27:46 crc kubenswrapper[4749]: I0320 07:27:46.605523 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ff4ae6b4-eebc-4a32-b390-ec7ea70c8841-service-ca\") pod \"ff4ae6b4-eebc-4a32-b390-ec7ea70c8841\" (UID: \"ff4ae6b4-eebc-4a32-b390-ec7ea70c8841\") "
Mar 20 07:27:46 crc kubenswrapper[4749]: I0320 07:27:46.605565 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff4ae6b4-eebc-4a32-b390-ec7ea70c8841-trusted-ca-bundle\") pod \"ff4ae6b4-eebc-4a32-b390-ec7ea70c8841\" (UID: \"ff4ae6b4-eebc-4a32-b390-ec7ea70c8841\") "
Mar 20 07:27:46 crc kubenswrapper[4749]: I0320 07:27:46.605599 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ff4ae6b4-eebc-4a32-b390-ec7ea70c8841-console-serving-cert\") pod \"ff4ae6b4-eebc-4a32-b390-ec7ea70c8841\" (UID: \"ff4ae6b4-eebc-4a32-b390-ec7ea70c8841\") "
Mar 20 07:27:46 crc kubenswrapper[4749]: I0320 07:27:46.605630 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ff4ae6b4-eebc-4a32-b390-ec7ea70c8841-console-oauth-config\") pod \"ff4ae6b4-eebc-4a32-b390-ec7ea70c8841\" (UID: \"ff4ae6b4-eebc-4a32-b390-ec7ea70c8841\") "
Mar 20 07:27:46 crc kubenswrapper[4749]: I0320 07:27:46.605700 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cx7wr\" (UniqueName: \"kubernetes.io/projected/ff4ae6b4-eebc-4a32-b390-ec7ea70c8841-kube-api-access-cx7wr\") pod \"ff4ae6b4-eebc-4a32-b390-ec7ea70c8841\" (UID: \"ff4ae6b4-eebc-4a32-b390-ec7ea70c8841\") "
Mar 20 07:27:46 crc kubenswrapper[4749]: I0320 07:27:46.605760 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ff4ae6b4-eebc-4a32-b390-ec7ea70c8841-oauth-serving-cert\") pod \"ff4ae6b4-eebc-4a32-b390-ec7ea70c8841\" (UID: \"ff4ae6b4-eebc-4a32-b390-ec7ea70c8841\") "
Mar 20 07:27:46 crc kubenswrapper[4749]: I0320 07:27:46.605793 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ff4ae6b4-eebc-4a32-b390-ec7ea70c8841-console-config\") pod \"ff4ae6b4-eebc-4a32-b390-ec7ea70c8841\" (UID: \"ff4ae6b4-eebc-4a32-b390-ec7ea70c8841\") "
Mar 20 07:27:46 crc kubenswrapper[4749]: I0320 07:27:46.606019 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbk95\" (UniqueName: \"kubernetes.io/projected/0d7f85fc-7895-4f42-8cc8-587c5f7f0f21-kube-api-access-lbk95\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16xgc9\" (UID: \"0d7f85fc-7895-4f42-8cc8-587c5f7f0f21\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16xgc9"
Mar 20 07:27:46 crc kubenswrapper[4749]: I0320 07:27:46.606052 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0d7f85fc-7895-4f42-8cc8-587c5f7f0f21-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16xgc9\" (UID: \"0d7f85fc-7895-4f42-8cc8-587c5f7f0f21\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16xgc9"
Mar 20 07:27:46 crc kubenswrapper[4749]: I0320 07:27:46.606073 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0d7f85fc-7895-4f42-8cc8-587c5f7f0f21-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16xgc9\" (UID: \"0d7f85fc-7895-4f42-8cc8-587c5f7f0f21\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16xgc9"
Mar 20 07:27:46 crc kubenswrapper[4749]: I0320 07:27:46.606504 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff4ae6b4-eebc-4a32-b390-ec7ea70c8841-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ff4ae6b4-eebc-4a32-b390-ec7ea70c8841" (UID: "ff4ae6b4-eebc-4a32-b390-ec7ea70c8841"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 07:27:46 crc kubenswrapper[4749]: I0320 07:27:46.606541 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff4ae6b4-eebc-4a32-b390-ec7ea70c8841-console-config" (OuterVolumeSpecName: "console-config") pod "ff4ae6b4-eebc-4a32-b390-ec7ea70c8841" (UID: "ff4ae6b4-eebc-4a32-b390-ec7ea70c8841"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 07:27:46 crc kubenswrapper[4749]: I0320 07:27:46.606566 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff4ae6b4-eebc-4a32-b390-ec7ea70c8841-service-ca" (OuterVolumeSpecName: "service-ca") pod "ff4ae6b4-eebc-4a32-b390-ec7ea70c8841" (UID: "ff4ae6b4-eebc-4a32-b390-ec7ea70c8841"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 07:27:46 crc kubenswrapper[4749]: I0320 07:27:46.607395 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff4ae6b4-eebc-4a32-b390-ec7ea70c8841-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ff4ae6b4-eebc-4a32-b390-ec7ea70c8841" (UID: "ff4ae6b4-eebc-4a32-b390-ec7ea70c8841"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 07:27:46 crc kubenswrapper[4749]: I0320 07:27:46.611836 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff4ae6b4-eebc-4a32-b390-ec7ea70c8841-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ff4ae6b4-eebc-4a32-b390-ec7ea70c8841" (UID: "ff4ae6b4-eebc-4a32-b390-ec7ea70c8841"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:27:46 crc kubenswrapper[4749]: I0320 07:27:46.612047 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff4ae6b4-eebc-4a32-b390-ec7ea70c8841-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ff4ae6b4-eebc-4a32-b390-ec7ea70c8841" (UID: "ff4ae6b4-eebc-4a32-b390-ec7ea70c8841"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 07:27:46 crc kubenswrapper[4749]: I0320 07:27:46.613076 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff4ae6b4-eebc-4a32-b390-ec7ea70c8841-kube-api-access-cx7wr" (OuterVolumeSpecName: "kube-api-access-cx7wr") pod "ff4ae6b4-eebc-4a32-b390-ec7ea70c8841" (UID: "ff4ae6b4-eebc-4a32-b390-ec7ea70c8841"). InnerVolumeSpecName "kube-api-access-cx7wr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 07:27:46 crc kubenswrapper[4749]: I0320 07:27:46.706983 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbk95\" (UniqueName: \"kubernetes.io/projected/0d7f85fc-7895-4f42-8cc8-587c5f7f0f21-kube-api-access-lbk95\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16xgc9\" (UID: \"0d7f85fc-7895-4f42-8cc8-587c5f7f0f21\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16xgc9"
Mar 20 07:27:46 crc kubenswrapper[4749]: I0320 07:27:46.707036 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0d7f85fc-7895-4f42-8cc8-587c5f7f0f21-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16xgc9\" (UID: \"0d7f85fc-7895-4f42-8cc8-587c5f7f0f21\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16xgc9"
Mar 20 07:27:46 crc kubenswrapper[4749]: I0320 07:27:46.707062 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0d7f85fc-7895-4f42-8cc8-587c5f7f0f21-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16xgc9\" (UID: \"0d7f85fc-7895-4f42-8cc8-587c5f7f0f21\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16xgc9"
Mar 20 07:27:46 crc kubenswrapper[4749]: I0320 07:27:46.707181 4749 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ff4ae6b4-eebc-4a32-b390-ec7ea70c8841-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 07:27:46 crc kubenswrapper[4749]: I0320 07:27:46.707196 4749 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ff4ae6b4-eebc-4a32-b390-ec7ea70c8841-console-config\") on node \"crc\" DevicePath \"\""
Mar 20 07:27:46 crc kubenswrapper[4749]: I0320 07:27:46.707207 4749 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ff4ae6b4-eebc-4a32-b390-ec7ea70c8841-service-ca\") on node \"crc\" DevicePath \"\""
Mar 20 07:27:46 crc kubenswrapper[4749]: I0320 07:27:46.707216 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff4ae6b4-eebc-4a32-b390-ec7ea70c8841-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 07:27:46 crc kubenswrapper[4749]: I0320 07:27:46.707227 4749 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ff4ae6b4-eebc-4a32-b390-ec7ea70c8841-console-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 07:27:46 crc kubenswrapper[4749]: I0320 07:27:46.707236 4749 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ff4ae6b4-eebc-4a32-b390-ec7ea70c8841-console-oauth-config\") on node \"crc\" DevicePath \"\""
node \"crc\" DevicePath \"\"" Mar 20 07:27:46 crc kubenswrapper[4749]: I0320 07:27:46.707248 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cx7wr\" (UniqueName: \"kubernetes.io/projected/ff4ae6b4-eebc-4a32-b390-ec7ea70c8841-kube-api-access-cx7wr\") on node \"crc\" DevicePath \"\"" Mar 20 07:27:46 crc kubenswrapper[4749]: I0320 07:27:46.707791 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0d7f85fc-7895-4f42-8cc8-587c5f7f0f21-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16xgc9\" (UID: \"0d7f85fc-7895-4f42-8cc8-587c5f7f0f21\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16xgc9" Mar 20 07:27:46 crc kubenswrapper[4749]: I0320 07:27:46.707846 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0d7f85fc-7895-4f42-8cc8-587c5f7f0f21-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16xgc9\" (UID: \"0d7f85fc-7895-4f42-8cc8-587c5f7f0f21\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16xgc9" Mar 20 07:27:46 crc kubenswrapper[4749]: I0320 07:27:46.725437 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbk95\" (UniqueName: \"kubernetes.io/projected/0d7f85fc-7895-4f42-8cc8-587c5f7f0f21-kube-api-access-lbk95\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16xgc9\" (UID: \"0d7f85fc-7895-4f42-8cc8-587c5f7f0f21\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16xgc9" Mar 20 07:27:46 crc kubenswrapper[4749]: I0320 07:27:46.888987 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16xgc9" Mar 20 07:27:47 crc kubenswrapper[4749]: I0320 07:27:47.214428 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-2zlqs_ff4ae6b4-eebc-4a32-b390-ec7ea70c8841/console/0.log" Mar 20 07:27:47 crc kubenswrapper[4749]: I0320 07:27:47.214480 4749 generic.go:334] "Generic (PLEG): container finished" podID="ff4ae6b4-eebc-4a32-b390-ec7ea70c8841" containerID="bd50e4fa543b07ebf9d0f5e3db4b3c9481c8d248ce432c831dcae0628425657d" exitCode=2 Mar 20 07:27:47 crc kubenswrapper[4749]: I0320 07:27:47.214512 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-2zlqs" event={"ID":"ff4ae6b4-eebc-4a32-b390-ec7ea70c8841","Type":"ContainerDied","Data":"bd50e4fa543b07ebf9d0f5e3db4b3c9481c8d248ce432c831dcae0628425657d"} Mar 20 07:27:47 crc kubenswrapper[4749]: I0320 07:27:47.214545 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-2zlqs" event={"ID":"ff4ae6b4-eebc-4a32-b390-ec7ea70c8841","Type":"ContainerDied","Data":"811b2e6891f9edbb93ed07eb48423a089213a55c90c90572e9b85dcb977db4ba"} Mar 20 07:27:47 crc kubenswrapper[4749]: I0320 07:27:47.214563 4749 scope.go:117] "RemoveContainer" containerID="bd50e4fa543b07ebf9d0f5e3db4b3c9481c8d248ce432c831dcae0628425657d" Mar 20 07:27:47 crc kubenswrapper[4749]: I0320 07:27:47.214559 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-2zlqs" Mar 20 07:27:47 crc kubenswrapper[4749]: I0320 07:27:47.237034 4749 scope.go:117] "RemoveContainer" containerID="bd50e4fa543b07ebf9d0f5e3db4b3c9481c8d248ce432c831dcae0628425657d" Mar 20 07:27:47 crc kubenswrapper[4749]: E0320 07:27:47.238071 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd50e4fa543b07ebf9d0f5e3db4b3c9481c8d248ce432c831dcae0628425657d\": container with ID starting with bd50e4fa543b07ebf9d0f5e3db4b3c9481c8d248ce432c831dcae0628425657d not found: ID does not exist" containerID="bd50e4fa543b07ebf9d0f5e3db4b3c9481c8d248ce432c831dcae0628425657d" Mar 20 07:27:47 crc kubenswrapper[4749]: I0320 07:27:47.238128 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd50e4fa543b07ebf9d0f5e3db4b3c9481c8d248ce432c831dcae0628425657d"} err="failed to get container status \"bd50e4fa543b07ebf9d0f5e3db4b3c9481c8d248ce432c831dcae0628425657d\": rpc error: code = NotFound desc = could not find container \"bd50e4fa543b07ebf9d0f5e3db4b3c9481c8d248ce432c831dcae0628425657d\": container with ID starting with bd50e4fa543b07ebf9d0f5e3db4b3c9481c8d248ce432c831dcae0628425657d not found: ID does not exist" Mar 20 07:27:47 crc kubenswrapper[4749]: I0320 07:27:47.255084 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-2zlqs"] Mar 20 07:27:47 crc kubenswrapper[4749]: I0320 07:27:47.259997 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-2zlqs"] Mar 20 07:27:47 crc kubenswrapper[4749]: I0320 07:27:47.363733 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16xgc9"] Mar 20 07:27:47 crc kubenswrapper[4749]: W0320 07:27:47.368792 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d7f85fc_7895_4f42_8cc8_587c5f7f0f21.slice/crio-f4f3f761f22584123aa87dbbac733652a80a7997c346d174344f4340a44e78cd WatchSource:0}: Error finding container f4f3f761f22584123aa87dbbac733652a80a7997c346d174344f4340a44e78cd: Status 404 returned error can't find the container with id f4f3f761f22584123aa87dbbac733652a80a7997c346d174344f4340a44e78cd Mar 20 07:27:48 crc kubenswrapper[4749]: I0320 07:27:48.193128 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff4ae6b4-eebc-4a32-b390-ec7ea70c8841" path="/var/lib/kubelet/pods/ff4ae6b4-eebc-4a32-b390-ec7ea70c8841/volumes" Mar 20 07:27:48 crc kubenswrapper[4749]: I0320 07:27:48.222268 4749 generic.go:334] "Generic (PLEG): container finished" podID="0d7f85fc-7895-4f42-8cc8-587c5f7f0f21" containerID="a9afc52caa43a9a55a59362b9169c3f6721c8aac373df3d44402407e5f0be635" exitCode=0 Mar 20 07:27:48 crc kubenswrapper[4749]: I0320 07:27:48.222370 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16xgc9" event={"ID":"0d7f85fc-7895-4f42-8cc8-587c5f7f0f21","Type":"ContainerDied","Data":"a9afc52caa43a9a55a59362b9169c3f6721c8aac373df3d44402407e5f0be635"} Mar 20 07:27:48 crc kubenswrapper[4749]: I0320 07:27:48.222401 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16xgc9" 
event={"ID":"0d7f85fc-7895-4f42-8cc8-587c5f7f0f21","Type":"ContainerStarted","Data":"f4f3f761f22584123aa87dbbac733652a80a7997c346d174344f4340a44e78cd"} Mar 20 07:27:51 crc kubenswrapper[4749]: I0320 07:27:51.255209 4749 generic.go:334] "Generic (PLEG): container finished" podID="0d7f85fc-7895-4f42-8cc8-587c5f7f0f21" containerID="ff298df04ce35b2ae5b5a85cd1dfd0f602711e1f44d665df1f7c41b089641023" exitCode=0 Mar 20 07:27:51 crc kubenswrapper[4749]: I0320 07:27:51.255252 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16xgc9" event={"ID":"0d7f85fc-7895-4f42-8cc8-587c5f7f0f21","Type":"ContainerDied","Data":"ff298df04ce35b2ae5b5a85cd1dfd0f602711e1f44d665df1f7c41b089641023"} Mar 20 07:27:52 crc kubenswrapper[4749]: I0320 07:27:52.268444 4749 generic.go:334] "Generic (PLEG): container finished" podID="0d7f85fc-7895-4f42-8cc8-587c5f7f0f21" containerID="ea32f1e6f20bdb07fa9df44e341483ae61309200dd66c59b0a16330a364326d0" exitCode=0 Mar 20 07:27:52 crc kubenswrapper[4749]: I0320 07:27:52.268518 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16xgc9" event={"ID":"0d7f85fc-7895-4f42-8cc8-587c5f7f0f21","Type":"ContainerDied","Data":"ea32f1e6f20bdb07fa9df44e341483ae61309200dd66c59b0a16330a364326d0"} Mar 20 07:27:53 crc kubenswrapper[4749]: I0320 07:27:53.615862 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16xgc9" Mar 20 07:27:53 crc kubenswrapper[4749]: I0320 07:27:53.724881 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0d7f85fc-7895-4f42-8cc8-587c5f7f0f21-util\") pod \"0d7f85fc-7895-4f42-8cc8-587c5f7f0f21\" (UID: \"0d7f85fc-7895-4f42-8cc8-587c5f7f0f21\") " Mar 20 07:27:53 crc kubenswrapper[4749]: I0320 07:27:53.725131 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbk95\" (UniqueName: \"kubernetes.io/projected/0d7f85fc-7895-4f42-8cc8-587c5f7f0f21-kube-api-access-lbk95\") pod \"0d7f85fc-7895-4f42-8cc8-587c5f7f0f21\" (UID: \"0d7f85fc-7895-4f42-8cc8-587c5f7f0f21\") " Mar 20 07:27:53 crc kubenswrapper[4749]: I0320 07:27:53.725164 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0d7f85fc-7895-4f42-8cc8-587c5f7f0f21-bundle\") pod \"0d7f85fc-7895-4f42-8cc8-587c5f7f0f21\" (UID: \"0d7f85fc-7895-4f42-8cc8-587c5f7f0f21\") " Mar 20 07:27:53 crc kubenswrapper[4749]: I0320 07:27:53.726928 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d7f85fc-7895-4f42-8cc8-587c5f7f0f21-bundle" (OuterVolumeSpecName: "bundle") pod "0d7f85fc-7895-4f42-8cc8-587c5f7f0f21" (UID: "0d7f85fc-7895-4f42-8cc8-587c5f7f0f21"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:27:53 crc kubenswrapper[4749]: I0320 07:27:53.730464 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d7f85fc-7895-4f42-8cc8-587c5f7f0f21-kube-api-access-lbk95" (OuterVolumeSpecName: "kube-api-access-lbk95") pod "0d7f85fc-7895-4f42-8cc8-587c5f7f0f21" (UID: "0d7f85fc-7895-4f42-8cc8-587c5f7f0f21"). InnerVolumeSpecName "kube-api-access-lbk95". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:27:53 crc kubenswrapper[4749]: I0320 07:27:53.735950 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d7f85fc-7895-4f42-8cc8-587c5f7f0f21-util" (OuterVolumeSpecName: "util") pod "0d7f85fc-7895-4f42-8cc8-587c5f7f0f21" (UID: "0d7f85fc-7895-4f42-8cc8-587c5f7f0f21"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:27:53 crc kubenswrapper[4749]: I0320 07:27:53.826376 4749 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0d7f85fc-7895-4f42-8cc8-587c5f7f0f21-util\") on node \"crc\" DevicePath \"\"" Mar 20 07:27:53 crc kubenswrapper[4749]: I0320 07:27:53.826412 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbk95\" (UniqueName: \"kubernetes.io/projected/0d7f85fc-7895-4f42-8cc8-587c5f7f0f21-kube-api-access-lbk95\") on node \"crc\" DevicePath \"\"" Mar 20 07:27:53 crc kubenswrapper[4749]: I0320 07:27:53.826427 4749 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0d7f85fc-7895-4f42-8cc8-587c5f7f0f21-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:27:54 crc kubenswrapper[4749]: I0320 07:27:54.286521 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16xgc9" event={"ID":"0d7f85fc-7895-4f42-8cc8-587c5f7f0f21","Type":"ContainerDied","Data":"f4f3f761f22584123aa87dbbac733652a80a7997c346d174344f4340a44e78cd"} Mar 20 07:27:54 crc kubenswrapper[4749]: I0320 07:27:54.286567 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4f3f761f22584123aa87dbbac733652a80a7997c346d174344f4340a44e78cd" Mar 20 07:27:54 crc kubenswrapper[4749]: I0320 07:27:54.286757 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16xgc9" Mar 20 07:28:00 crc kubenswrapper[4749]: I0320 07:28:00.186122 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566528-wd9jp"] Mar 20 07:28:00 crc kubenswrapper[4749]: E0320 07:28:00.186974 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d7f85fc-7895-4f42-8cc8-587c5f7f0f21" containerName="util" Mar 20 07:28:00 crc kubenswrapper[4749]: I0320 07:28:00.186993 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d7f85fc-7895-4f42-8cc8-587c5f7f0f21" containerName="util" Mar 20 07:28:00 crc kubenswrapper[4749]: E0320 07:28:00.187018 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d7f85fc-7895-4f42-8cc8-587c5f7f0f21" containerName="extract" Mar 20 07:28:00 crc kubenswrapper[4749]: I0320 07:28:00.187030 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d7f85fc-7895-4f42-8cc8-587c5f7f0f21" containerName="extract" Mar 20 07:28:00 crc kubenswrapper[4749]: E0320 07:28:00.187057 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d7f85fc-7895-4f42-8cc8-587c5f7f0f21" containerName="pull" Mar 20 07:28:00 crc kubenswrapper[4749]: I0320 07:28:00.187069 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d7f85fc-7895-4f42-8cc8-587c5f7f0f21" containerName="pull" Mar 20 07:28:00 crc kubenswrapper[4749]: I0320 07:28:00.187241 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d7f85fc-7895-4f42-8cc8-587c5f7f0f21" containerName="extract" Mar 20 07:28:00 crc kubenswrapper[4749]: I0320 07:28:00.187851 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566528-wd9jp" Mar 20 07:28:00 crc kubenswrapper[4749]: I0320 07:28:00.189904 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 07:28:00 crc kubenswrapper[4749]: I0320 07:28:00.190443 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6rhdf" Mar 20 07:28:00 crc kubenswrapper[4749]: I0320 07:28:00.192189 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566528-wd9jp"] Mar 20 07:28:00 crc kubenswrapper[4749]: I0320 07:28:00.193258 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 07:28:00 crc kubenswrapper[4749]: I0320 07:28:00.211857 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vqhz\" (UniqueName: \"kubernetes.io/projected/3d1f5c84-c36f-40ba-b778-11bacadeb004-kube-api-access-8vqhz\") pod \"auto-csr-approver-29566528-wd9jp\" (UID: \"3d1f5c84-c36f-40ba-b778-11bacadeb004\") " pod="openshift-infra/auto-csr-approver-29566528-wd9jp" Mar 20 07:28:00 crc kubenswrapper[4749]: I0320 07:28:00.312990 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vqhz\" (UniqueName: \"kubernetes.io/projected/3d1f5c84-c36f-40ba-b778-11bacadeb004-kube-api-access-8vqhz\") pod \"auto-csr-approver-29566528-wd9jp\" (UID: \"3d1f5c84-c36f-40ba-b778-11bacadeb004\") " pod="openshift-infra/auto-csr-approver-29566528-wd9jp" Mar 20 07:28:00 crc kubenswrapper[4749]: I0320 07:28:00.334364 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vqhz\" (UniqueName: 
\"kubernetes.io/projected/3d1f5c84-c36f-40ba-b778-11bacadeb004-kube-api-access-8vqhz\") pod \"auto-csr-approver-29566528-wd9jp\" (UID: \"3d1f5c84-c36f-40ba-b778-11bacadeb004\") " pod="openshift-infra/auto-csr-approver-29566528-wd9jp" Mar 20 07:28:00 crc kubenswrapper[4749]: I0320 07:28:00.511856 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566528-wd9jp" Mar 20 07:28:00 crc kubenswrapper[4749]: I0320 07:28:00.964709 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566528-wd9jp"] Mar 20 07:28:01 crc kubenswrapper[4749]: I0320 07:28:01.332922 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566528-wd9jp" event={"ID":"3d1f5c84-c36f-40ba-b778-11bacadeb004","Type":"ContainerStarted","Data":"aec32c1a18ac9e01bcb58089277cf05fa624470621fd75944bccee9e62e4358b"} Mar 20 07:28:02 crc kubenswrapper[4749]: I0320 07:28:02.152353 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mx87z"] Mar 20 07:28:02 crc kubenswrapper[4749]: I0320 07:28:02.153890 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mx87z" Mar 20 07:28:02 crc kubenswrapper[4749]: I0320 07:28:02.159916 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mx87z"] Mar 20 07:28:02 crc kubenswrapper[4749]: I0320 07:28:02.340204 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/062c83dc-bea8-474f-86b5-3f91b2645a6b-catalog-content\") pod \"redhat-marketplace-mx87z\" (UID: \"062c83dc-bea8-474f-86b5-3f91b2645a6b\") " pod="openshift-marketplace/redhat-marketplace-mx87z" Mar 20 07:28:02 crc kubenswrapper[4749]: I0320 07:28:02.340249 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgdwn\" (UniqueName: \"kubernetes.io/projected/062c83dc-bea8-474f-86b5-3f91b2645a6b-kube-api-access-sgdwn\") pod \"redhat-marketplace-mx87z\" (UID: \"062c83dc-bea8-474f-86b5-3f91b2645a6b\") " pod="openshift-marketplace/redhat-marketplace-mx87z" Mar 20 07:28:02 crc kubenswrapper[4749]: I0320 07:28:02.340483 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/062c83dc-bea8-474f-86b5-3f91b2645a6b-utilities\") pod \"redhat-marketplace-mx87z\" (UID: \"062c83dc-bea8-474f-86b5-3f91b2645a6b\") " pod="openshift-marketplace/redhat-marketplace-mx87z" Mar 20 07:28:02 crc kubenswrapper[4749]: I0320 07:28:02.343267 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566528-wd9jp" event={"ID":"3d1f5c84-c36f-40ba-b778-11bacadeb004","Type":"ContainerStarted","Data":"2170aeb79ac4e0052bd067379b6b4bdabf45574632953de8ad9b56e8e580441d"} Mar 20 07:28:02 crc kubenswrapper[4749]: I0320 07:28:02.441740 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/062c83dc-bea8-474f-86b5-3f91b2645a6b-utilities\") pod \"redhat-marketplace-mx87z\" (UID: \"062c83dc-bea8-474f-86b5-3f91b2645a6b\") " pod="openshift-marketplace/redhat-marketplace-mx87z" Mar 20 07:28:02 crc kubenswrapper[4749]: I0320 07:28:02.442008 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/062c83dc-bea8-474f-86b5-3f91b2645a6b-catalog-content\") pod \"redhat-marketplace-mx87z\" (UID: \"062c83dc-bea8-474f-86b5-3f91b2645a6b\") " pod="openshift-marketplace/redhat-marketplace-mx87z" Mar 20 07:28:02 crc kubenswrapper[4749]: I0320 07:28:02.442106 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgdwn\" (UniqueName: \"kubernetes.io/projected/062c83dc-bea8-474f-86b5-3f91b2645a6b-kube-api-access-sgdwn\") pod \"redhat-marketplace-mx87z\" (UID: \"062c83dc-bea8-474f-86b5-3f91b2645a6b\") " pod="openshift-marketplace/redhat-marketplace-mx87z" Mar 20 07:28:02 crc kubenswrapper[4749]: I0320 07:28:02.442197 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/062c83dc-bea8-474f-86b5-3f91b2645a6b-utilities\") pod \"redhat-marketplace-mx87z\" (UID: \"062c83dc-bea8-474f-86b5-3f91b2645a6b\") " pod="openshift-marketplace/redhat-marketplace-mx87z" Mar 20 07:28:02 crc kubenswrapper[4749]: I0320 07:28:02.442559 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/062c83dc-bea8-474f-86b5-3f91b2645a6b-catalog-content\") pod \"redhat-marketplace-mx87z\" (UID: \"062c83dc-bea8-474f-86b5-3f91b2645a6b\") " pod="openshift-marketplace/redhat-marketplace-mx87z" Mar 20 07:28:02 crc kubenswrapper[4749]: I0320 07:28:02.465649 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgdwn\" (UniqueName: \"kubernetes.io/projected/062c83dc-bea8-474f-86b5-3f91b2645a6b-kube-api-access-sgdwn\") pod \"redhat-marketplace-mx87z\" (UID: \"062c83dc-bea8-474f-86b5-3f91b2645a6b\") " pod="openshift-marketplace/redhat-marketplace-mx87z" Mar 20 07:28:02 crc kubenswrapper[4749]: I0320 07:28:02.475650 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mx87z" Mar 20 07:28:02 crc kubenswrapper[4749]: I0320 07:28:02.881928 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566528-wd9jp" podStartSLOduration=1.855962975 podStartE2EDuration="2.881913446s" podCreationTimestamp="2026-03-20 07:28:00 +0000 UTC" firstStartedPulling="2026-03-20 07:28:00.970941978 +0000 UTC m=+917.520599625" lastFinishedPulling="2026-03-20 07:28:01.996892419 +0000 UTC m=+918.546550096" observedRunningTime="2026-03-20 07:28:02.357407238 +0000 UTC m=+918.907064905" watchObservedRunningTime="2026-03-20 07:28:02.881913446 +0000 UTC m=+919.431571093" Mar 20 07:28:02 crc kubenswrapper[4749]: I0320 07:28:02.883492 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mx87z"] Mar 20 07:28:02 crc kubenswrapper[4749]: W0320 07:28:02.894241 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod062c83dc_bea8_474f_86b5_3f91b2645a6b.slice/crio-bdd6ac276ff3a0a7ea0f1f5b579f05526930fe5d664d2384a8b42d91ea0f0fc1 WatchSource:0}: Error finding container bdd6ac276ff3a0a7ea0f1f5b579f05526930fe5d664d2384a8b42d91ea0f0fc1: Status 404 returned error can't find the container with id bdd6ac276ff3a0a7ea0f1f5b579f05526930fe5d664d2384a8b42d91ea0f0fc1 Mar 20 07:28:03 crc kubenswrapper[4749]: I0320 07:28:03.355399 4749 generic.go:334] "Generic (PLEG): container finished" podID="062c83dc-bea8-474f-86b5-3f91b2645a6b" containerID="61d46edba2d68fa394fece7f14b91d904ff69cc11088db7685d1a0a7a36e12fd" exitCode=0 Mar 20 07:28:03 crc kubenswrapper[4749]: I0320 07:28:03.355570 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mx87z" event={"ID":"062c83dc-bea8-474f-86b5-3f91b2645a6b","Type":"ContainerDied","Data":"61d46edba2d68fa394fece7f14b91d904ff69cc11088db7685d1a0a7a36e12fd"} Mar 20 07:28:03 crc kubenswrapper[4749]: I0320 07:28:03.355867 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mx87z" event={"ID":"062c83dc-bea8-474f-86b5-3f91b2645a6b","Type":"ContainerStarted","Data":"bdd6ac276ff3a0a7ea0f1f5b579f05526930fe5d664d2384a8b42d91ea0f0fc1"} Mar 20 07:28:03 crc kubenswrapper[4749]: I0320 07:28:03.373989 4749 generic.go:334] "Generic (PLEG): container finished" podID="3d1f5c84-c36f-40ba-b778-11bacadeb004" containerID="2170aeb79ac4e0052bd067379b6b4bdabf45574632953de8ad9b56e8e580441d" exitCode=0 Mar 20 07:28:03 crc kubenswrapper[4749]: I0320 07:28:03.374034 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566528-wd9jp" event={"ID":"3d1f5c84-c36f-40ba-b778-11bacadeb004","Type":"ContainerDied","Data":"2170aeb79ac4e0052bd067379b6b4bdabf45574632953de8ad9b56e8e580441d"} Mar 20 07:28:04 crc kubenswrapper[4749]: I0320 07:28:04.380497 4749 generic.go:334] "Generic (PLEG): container finished" podID="062c83dc-bea8-474f-86b5-3f91b2645a6b" containerID="509ec8723e9dff1fdc39d5c91992d96421e080c633ee9f9fdc0eedf27d49bb9c" exitCode=0 Mar 20 07:28:04 crc kubenswrapper[4749]: I0320 07:28:04.380675 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mx87z" event={"ID":"062c83dc-bea8-474f-86b5-3f91b2645a6b","Type":"ContainerDied","Data":"509ec8723e9dff1fdc39d5c91992d96421e080c633ee9f9fdc0eedf27d49bb9c"} Mar 20 07:28:04 crc kubenswrapper[4749]: I0320 07:28:04.717196 4749 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566528-wd9jp" Mar 20 07:28:04 crc kubenswrapper[4749]: I0320 07:28:04.731238 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-789686cc7-8s64f"] Mar 20 07:28:04 crc kubenswrapper[4749]: E0320 07:28:04.731453 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d1f5c84-c36f-40ba-b778-11bacadeb004" containerName="oc" Mar 20 07:28:04 crc kubenswrapper[4749]: I0320 07:28:04.731468 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d1f5c84-c36f-40ba-b778-11bacadeb004" containerName="oc" Mar 20 07:28:04 crc kubenswrapper[4749]: I0320 07:28:04.731554 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d1f5c84-c36f-40ba-b778-11bacadeb004" containerName="oc" Mar 20 07:28:04 crc kubenswrapper[4749]: I0320 07:28:04.731901 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-789686cc7-8s64f" Mar 20 07:28:04 crc kubenswrapper[4749]: I0320 07:28:04.735082 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 20 07:28:04 crc kubenswrapper[4749]: I0320 07:28:04.735107 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-l9lm9" Mar 20 07:28:04 crc kubenswrapper[4749]: I0320 07:28:04.735331 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 20 07:28:04 crc kubenswrapper[4749]: I0320 07:28:04.742732 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 20 07:28:04 crc kubenswrapper[4749]: I0320 07:28:04.743132 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 20 07:28:04 crc kubenswrapper[4749]: I0320 07:28:04.760159 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-789686cc7-8s64f"] Mar 20 07:28:04 crc kubenswrapper[4749]: I0320 07:28:04.873838 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vqhz\" (UniqueName: \"kubernetes.io/projected/3d1f5c84-c36f-40ba-b778-11bacadeb004-kube-api-access-8vqhz\") pod \"3d1f5c84-c36f-40ba-b778-11bacadeb004\" (UID: \"3d1f5c84-c36f-40ba-b778-11bacadeb004\") " Mar 20 07:28:04 crc kubenswrapper[4749]: I0320 07:28:04.874108 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2cee17f1-fcc8-4ae8-aafe-d7eebabbe966-webhook-cert\") pod \"metallb-operator-controller-manager-789686cc7-8s64f\" (UID: \"2cee17f1-fcc8-4ae8-aafe-d7eebabbe966\") " pod="metallb-system/metallb-operator-controller-manager-789686cc7-8s64f" Mar 20 07:28:04 crc kubenswrapper[4749]: I0320 07:28:04.874200 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87h8l\" (UniqueName: \"kubernetes.io/projected/2cee17f1-fcc8-4ae8-aafe-d7eebabbe966-kube-api-access-87h8l\") pod \"metallb-operator-controller-manager-789686cc7-8s64f\" (UID: \"2cee17f1-fcc8-4ae8-aafe-d7eebabbe966\") " pod="metallb-system/metallb-operator-controller-manager-789686cc7-8s64f" Mar 20 07:28:04 crc kubenswrapper[4749]: I0320 07:28:04.874304 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2cee17f1-fcc8-4ae8-aafe-d7eebabbe966-apiservice-cert\") pod \"metallb-operator-controller-manager-789686cc7-8s64f\" (UID: \"2cee17f1-fcc8-4ae8-aafe-d7eebabbe966\") " pod="metallb-system/metallb-operator-controller-manager-789686cc7-8s64f" Mar 20 07:28:04 crc kubenswrapper[4749]: I0320 07:28:04.879357 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d1f5c84-c36f-40ba-b778-11bacadeb004-kube-api-access-8vqhz" (OuterVolumeSpecName: "kube-api-access-8vqhz") pod "3d1f5c84-c36f-40ba-b778-11bacadeb004" (UID: "3d1f5c84-c36f-40ba-b778-11bacadeb004"). InnerVolumeSpecName "kube-api-access-8vqhz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:28:04 crc kubenswrapper[4749]: I0320 07:28:04.972531 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-784cc76666-86nb4"] Mar 20 07:28:04 crc kubenswrapper[4749]: I0320 07:28:04.973148 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-784cc76666-86nb4" Mar 20 07:28:04 crc kubenswrapper[4749]: I0320 07:28:04.975203 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2cee17f1-fcc8-4ae8-aafe-d7eebabbe966-apiservice-cert\") pod \"metallb-operator-controller-manager-789686cc7-8s64f\" (UID: \"2cee17f1-fcc8-4ae8-aafe-d7eebabbe966\") " pod="metallb-system/metallb-operator-controller-manager-789686cc7-8s64f" Mar 20 07:28:04 crc kubenswrapper[4749]: I0320 07:28:04.975454 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2cee17f1-fcc8-4ae8-aafe-d7eebabbe966-webhook-cert\") pod \"metallb-operator-controller-manager-789686cc7-8s64f\" (UID: \"2cee17f1-fcc8-4ae8-aafe-d7eebabbe966\") " pod="metallb-system/metallb-operator-controller-manager-789686cc7-8s64f" Mar 20 07:28:04 crc kubenswrapper[4749]: I0320 07:28:04.975611 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87h8l\" (UniqueName: \"kubernetes.io/projected/2cee17f1-fcc8-4ae8-aafe-d7eebabbe966-kube-api-access-87h8l\") pod \"metallb-operator-controller-manager-789686cc7-8s64f\" (UID: \"2cee17f1-fcc8-4ae8-aafe-d7eebabbe966\") " pod="metallb-system/metallb-operator-controller-manager-789686cc7-8s64f" Mar 20 07:28:04 crc kubenswrapper[4749]: I0320 07:28:04.975742 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vqhz\" (UniqueName: \"kubernetes.io/projected/3d1f5c84-c36f-40ba-b778-11bacadeb004-kube-api-access-8vqhz\") on node \"crc\" DevicePath \"\"" Mar 20 07:28:04 crc kubenswrapper[4749]: I0320 07:28:04.975964 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 20 07:28:04 crc kubenswrapper[4749]: I0320 07:28:04.976215 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 20 07:28:04 crc kubenswrapper[4749]: I0320 07:28:04.977165 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-4c8gz" Mar 20 07:28:04 crc kubenswrapper[4749]: I0320 07:28:04.979988 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" 
(UniqueName: \"kubernetes.io/secret/2cee17f1-fcc8-4ae8-aafe-d7eebabbe966-apiservice-cert\") pod \"metallb-operator-controller-manager-789686cc7-8s64f\" (UID: \"2cee17f1-fcc8-4ae8-aafe-d7eebabbe966\") " pod="metallb-system/metallb-operator-controller-manager-789686cc7-8s64f" Mar 20 07:28:04 crc kubenswrapper[4749]: I0320 07:28:04.980621 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2cee17f1-fcc8-4ae8-aafe-d7eebabbe966-webhook-cert\") pod \"metallb-operator-controller-manager-789686cc7-8s64f\" (UID: \"2cee17f1-fcc8-4ae8-aafe-d7eebabbe966\") " pod="metallb-system/metallb-operator-controller-manager-789686cc7-8s64f" Mar 20 07:28:04 crc kubenswrapper[4749]: I0320 07:28:04.993768 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-784cc76666-86nb4"] Mar 20 07:28:05 crc kubenswrapper[4749]: I0320 07:28:05.006016 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87h8l\" (UniqueName: \"kubernetes.io/projected/2cee17f1-fcc8-4ae8-aafe-d7eebabbe966-kube-api-access-87h8l\") pod \"metallb-operator-controller-manager-789686cc7-8s64f\" (UID: \"2cee17f1-fcc8-4ae8-aafe-d7eebabbe966\") " pod="metallb-system/metallb-operator-controller-manager-789686cc7-8s64f" Mar 20 07:28:05 crc kubenswrapper[4749]: I0320 07:28:05.058555 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-789686cc7-8s64f" Mar 20 07:28:05 crc kubenswrapper[4749]: I0320 07:28:05.077190 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5201be3d-1ed8-4536-a235-1adc0203c10e-webhook-cert\") pod \"metallb-operator-webhook-server-784cc76666-86nb4\" (UID: \"5201be3d-1ed8-4536-a235-1adc0203c10e\") " pod="metallb-system/metallb-operator-webhook-server-784cc76666-86nb4" Mar 20 07:28:05 crc kubenswrapper[4749]: I0320 07:28:05.077242 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzbcv\" (UniqueName: \"kubernetes.io/projected/5201be3d-1ed8-4536-a235-1adc0203c10e-kube-api-access-vzbcv\") pod \"metallb-operator-webhook-server-784cc76666-86nb4\" (UID: \"5201be3d-1ed8-4536-a235-1adc0203c10e\") " pod="metallb-system/metallb-operator-webhook-server-784cc76666-86nb4" Mar 20 07:28:05 crc kubenswrapper[4749]: I0320 07:28:05.077264 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5201be3d-1ed8-4536-a235-1adc0203c10e-apiservice-cert\") pod \"metallb-operator-webhook-server-784cc76666-86nb4\" (UID: \"5201be3d-1ed8-4536-a235-1adc0203c10e\") " pod="metallb-system/metallb-operator-webhook-server-784cc76666-86nb4" Mar 20 07:28:05 crc kubenswrapper[4749]: I0320 07:28:05.178634 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5201be3d-1ed8-4536-a235-1adc0203c10e-webhook-cert\") pod \"metallb-operator-webhook-server-784cc76666-86nb4\" (UID: \"5201be3d-1ed8-4536-a235-1adc0203c10e\") " pod="metallb-system/metallb-operator-webhook-server-784cc76666-86nb4" Mar 20 07:28:05 crc kubenswrapper[4749]: I0320 07:28:05.178953 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzbcv\" (UniqueName: 
\"kubernetes.io/projected/5201be3d-1ed8-4536-a235-1adc0203c10e-kube-api-access-vzbcv\") pod \"metallb-operator-webhook-server-784cc76666-86nb4\" (UID: \"5201be3d-1ed8-4536-a235-1adc0203c10e\") " pod="metallb-system/metallb-operator-webhook-server-784cc76666-86nb4" Mar 20 07:28:05 crc kubenswrapper[4749]: I0320 07:28:05.178970 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5201be3d-1ed8-4536-a235-1adc0203c10e-apiservice-cert\") pod \"metallb-operator-webhook-server-784cc76666-86nb4\" (UID: \"5201be3d-1ed8-4536-a235-1adc0203c10e\") " pod="metallb-system/metallb-operator-webhook-server-784cc76666-86nb4" Mar 20 07:28:05 crc kubenswrapper[4749]: I0320 07:28:05.182153 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5201be3d-1ed8-4536-a235-1adc0203c10e-apiservice-cert\") pod \"metallb-operator-webhook-server-784cc76666-86nb4\" (UID: \"5201be3d-1ed8-4536-a235-1adc0203c10e\") " pod="metallb-system/metallb-operator-webhook-server-784cc76666-86nb4" Mar 20 07:28:05 crc kubenswrapper[4749]: I0320 07:28:05.182835 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5201be3d-1ed8-4536-a235-1adc0203c10e-webhook-cert\") pod \"metallb-operator-webhook-server-784cc76666-86nb4\" (UID: \"5201be3d-1ed8-4536-a235-1adc0203c10e\") " pod="metallb-system/metallb-operator-webhook-server-784cc76666-86nb4" Mar 20 07:28:05 crc kubenswrapper[4749]: I0320 07:28:05.198443 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzbcv\" (UniqueName: \"kubernetes.io/projected/5201be3d-1ed8-4536-a235-1adc0203c10e-kube-api-access-vzbcv\") pod \"metallb-operator-webhook-server-784cc76666-86nb4\" (UID: \"5201be3d-1ed8-4536-a235-1adc0203c10e\") " pod="metallb-system/metallb-operator-webhook-server-784cc76666-86nb4" Mar 20 07:28:05 crc kubenswrapper[4749]: I0320 07:28:05.277865 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-789686cc7-8s64f"] Mar 20 07:28:05 crc kubenswrapper[4749]: W0320 07:28:05.278226 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2cee17f1_fcc8_4ae8_aafe_d7eebabbe966.slice/crio-db7dd9a1e88233064d92ad785eeb0afc32d1f72aa01ca3593c63938a815dfd13 WatchSource:0}: Error finding container db7dd9a1e88233064d92ad785eeb0afc32d1f72aa01ca3593c63938a815dfd13: Status 404 returned error can't find the container with id db7dd9a1e88233064d92ad785eeb0afc32d1f72aa01ca3593c63938a815dfd13 Mar 20 07:28:05 crc kubenswrapper[4749]: I0320 07:28:05.312467 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-784cc76666-86nb4" Mar 20 07:28:05 crc kubenswrapper[4749]: I0320 07:28:05.398612 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mx87z" event={"ID":"062c83dc-bea8-474f-86b5-3f91b2645a6b","Type":"ContainerStarted","Data":"ac83592ed70f69c56abe24160d8829ea48fff89bea23886291c860d5e55d9d24"} Mar 20 07:28:05 crc kubenswrapper[4749]: I0320 07:28:05.406674 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566528-wd9jp" Mar 20 07:28:05 crc kubenswrapper[4749]: I0320 07:28:05.406673 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566528-wd9jp" event={"ID":"3d1f5c84-c36f-40ba-b778-11bacadeb004","Type":"ContainerDied","Data":"aec32c1a18ac9e01bcb58089277cf05fa624470621fd75944bccee9e62e4358b"} Mar 20 07:28:05 crc kubenswrapper[4749]: I0320 07:28:05.406784 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aec32c1a18ac9e01bcb58089277cf05fa624470621fd75944bccee9e62e4358b" Mar 20 07:28:05 crc kubenswrapper[4749]: I0320 07:28:05.425107 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-789686cc7-8s64f" event={"ID":"2cee17f1-fcc8-4ae8-aafe-d7eebabbe966","Type":"ContainerStarted","Data":"db7dd9a1e88233064d92ad785eeb0afc32d1f72aa01ca3593c63938a815dfd13"} Mar 20 07:28:05 crc kubenswrapper[4749]: I0320 07:28:05.450734 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mx87z" podStartSLOduration=1.940073345 podStartE2EDuration="3.450719675s" podCreationTimestamp="2026-03-20 07:28:02 +0000 UTC" firstStartedPulling="2026-03-20 07:28:03.357424552 +0000 UTC m=+919.907082209" lastFinishedPulling="2026-03-20 07:28:04.868070892 +0000 UTC m=+921.417728539" observedRunningTime="2026-03-20 07:28:05.450215123 +0000 UTC m=+921.999872770" watchObservedRunningTime="2026-03-20 07:28:05.450719675 +0000 UTC m=+922.000377322" Mar 20 07:28:05 crc kubenswrapper[4749]: I0320 07:28:05.460191 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566522-f2md2"] Mar 20 07:28:05 crc kubenswrapper[4749]: I0320 07:28:05.465015 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566522-f2md2"] Mar 20 07:28:05 crc kubenswrapper[4749]: I0320 07:28:05.606396 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-784cc76666-86nb4"] Mar 20 07:28:05 crc kubenswrapper[4749]: W0320 07:28:05.612116 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5201be3d_1ed8_4536_a235_1adc0203c10e.slice/crio-c31a48ec870a219e5d0d2e4ddb587a8cea5ab16d6b04c7db8867c3134d00cd7f WatchSource:0}: Error finding container c31a48ec870a219e5d0d2e4ddb587a8cea5ab16d6b04c7db8867c3134d00cd7f: Status 404 returned error can't find the container with id c31a48ec870a219e5d0d2e4ddb587a8cea5ab16d6b04c7db8867c3134d00cd7f Mar 20 07:28:06 crc kubenswrapper[4749]: I0320 07:28:06.184661 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80a300a4-78f9-4407-af0b-60e66f310b87" path="/var/lib/kubelet/pods/80a300a4-78f9-4407-af0b-60e66f310b87/volumes" Mar 20 07:28:06 crc kubenswrapper[4749]: I0320 07:28:06.434076 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-784cc76666-86nb4" event={"ID":"5201be3d-1ed8-4536-a235-1adc0203c10e","Type":"ContainerStarted","Data":"c31a48ec870a219e5d0d2e4ddb587a8cea5ab16d6b04c7db8867c3134d00cd7f"} Mar 20 07:28:09 crc kubenswrapper[4749]: I0320 07:28:09.469815 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-789686cc7-8s64f" 
event={"ID":"2cee17f1-fcc8-4ae8-aafe-d7eebabbe966","Type":"ContainerStarted","Data":"40c78b40b30ecbb5858770e941d2419d88d2012d934a61fb0ff7b2becab718cf"} Mar 20 07:28:09 crc kubenswrapper[4749]: I0320 07:28:09.470370 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-789686cc7-8s64f" Mar 20 07:28:09 crc kubenswrapper[4749]: I0320 07:28:09.499059 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-789686cc7-8s64f" podStartSLOduration=2.159678899 podStartE2EDuration="5.49903776s" podCreationTimestamp="2026-03-20 07:28:04 +0000 UTC" firstStartedPulling="2026-03-20 07:28:05.28076173 +0000 UTC m=+921.830419367" lastFinishedPulling="2026-03-20 07:28:08.620120581 +0000 UTC m=+925.169778228" observedRunningTime="2026-03-20 07:28:09.497530614 +0000 UTC m=+926.047188261" watchObservedRunningTime="2026-03-20 07:28:09.49903776 +0000 UTC m=+926.048695407" Mar 20 07:28:10 crc kubenswrapper[4749]: I0320 07:28:10.478779 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-784cc76666-86nb4" event={"ID":"5201be3d-1ed8-4536-a235-1adc0203c10e","Type":"ContainerStarted","Data":"06429bd66596929dc81cb65115291793a941f2fc2870936fbbfa72dfe9f2cf23"} Mar 20 07:28:10 crc kubenswrapper[4749]: I0320 07:28:10.510360 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-784cc76666-86nb4" podStartSLOduration=1.863355343 podStartE2EDuration="6.510335908s" podCreationTimestamp="2026-03-20 07:28:04 +0000 UTC" firstStartedPulling="2026-03-20 07:28:05.616483699 +0000 UTC m=+922.166141346" lastFinishedPulling="2026-03-20 07:28:10.263464254 +0000 UTC m=+926.813121911" observedRunningTime="2026-03-20 07:28:10.50959803 +0000 UTC m=+927.059255697" watchObservedRunningTime="2026-03-20 07:28:10.510335908 +0000 UTC m=+927.059993575" Mar 20 07:28:11 crc kubenswrapper[4749]: I0320 07:28:11.483405 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-784cc76666-86nb4" Mar 20 07:28:12 crc kubenswrapper[4749]: I0320 07:28:12.476215 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mx87z" Mar 20 07:28:12 crc kubenswrapper[4749]: I0320 07:28:12.476463 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mx87z" Mar 20 07:28:12 crc kubenswrapper[4749]: I0320 07:28:12.539066 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mx87z" Mar 20 07:28:12 crc kubenswrapper[4749]: I0320 07:28:12.588307 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mx87z" Mar 20 07:28:12 crc kubenswrapper[4749]: I0320 07:28:12.767368 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mx87z"] Mar 20 07:28:14 crc kubenswrapper[4749]: I0320 07:28:14.504022 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mx87z" podUID="062c83dc-bea8-474f-86b5-3f91b2645a6b" containerName="registry-server" containerID="cri-o://ac83592ed70f69c56abe24160d8829ea48fff89bea23886291c860d5e55d9d24" gracePeriod=2 Mar 20 07:28:14 crc kubenswrapper[4749]: I0320 07:28:14.882560 
4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mx87z" Mar 20 07:28:15 crc kubenswrapper[4749]: I0320 07:28:15.009992 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgdwn\" (UniqueName: \"kubernetes.io/projected/062c83dc-bea8-474f-86b5-3f91b2645a6b-kube-api-access-sgdwn\") pod \"062c83dc-bea8-474f-86b5-3f91b2645a6b\" (UID: \"062c83dc-bea8-474f-86b5-3f91b2645a6b\") " Mar 20 07:28:15 crc kubenswrapper[4749]: I0320 07:28:15.010044 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/062c83dc-bea8-474f-86b5-3f91b2645a6b-catalog-content\") pod \"062c83dc-bea8-474f-86b5-3f91b2645a6b\" (UID: \"062c83dc-bea8-474f-86b5-3f91b2645a6b\") " Mar 20 07:28:15 crc kubenswrapper[4749]: I0320 07:28:15.010070 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/062c83dc-bea8-474f-86b5-3f91b2645a6b-utilities\") pod \"062c83dc-bea8-474f-86b5-3f91b2645a6b\" (UID: \"062c83dc-bea8-474f-86b5-3f91b2645a6b\") " Mar 20 07:28:15 crc kubenswrapper[4749]: I0320 07:28:15.011132 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/062c83dc-bea8-474f-86b5-3f91b2645a6b-utilities" (OuterVolumeSpecName: "utilities") pod "062c83dc-bea8-474f-86b5-3f91b2645a6b" (UID: "062c83dc-bea8-474f-86b5-3f91b2645a6b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:28:15 crc kubenswrapper[4749]: I0320 07:28:15.019466 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/062c83dc-bea8-474f-86b5-3f91b2645a6b-kube-api-access-sgdwn" (OuterVolumeSpecName: "kube-api-access-sgdwn") pod "062c83dc-bea8-474f-86b5-3f91b2645a6b" (UID: "062c83dc-bea8-474f-86b5-3f91b2645a6b"). InnerVolumeSpecName "kube-api-access-sgdwn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:28:15 crc kubenswrapper[4749]: I0320 07:28:15.045210 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/062c83dc-bea8-474f-86b5-3f91b2645a6b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "062c83dc-bea8-474f-86b5-3f91b2645a6b" (UID: "062c83dc-bea8-474f-86b5-3f91b2645a6b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:28:15 crc kubenswrapper[4749]: I0320 07:28:15.110944 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgdwn\" (UniqueName: \"kubernetes.io/projected/062c83dc-bea8-474f-86b5-3f91b2645a6b-kube-api-access-sgdwn\") on node \"crc\" DevicePath \"\"" Mar 20 07:28:15 crc kubenswrapper[4749]: I0320 07:28:15.110976 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/062c83dc-bea8-474f-86b5-3f91b2645a6b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 07:28:15 crc kubenswrapper[4749]: I0320 07:28:15.110987 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/062c83dc-bea8-474f-86b5-3f91b2645a6b-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 07:28:15 crc kubenswrapper[4749]: I0320 07:28:15.515379 4749 generic.go:334] "Generic (PLEG): container finished" podID="062c83dc-bea8-474f-86b5-3f91b2645a6b" containerID="ac83592ed70f69c56abe24160d8829ea48fff89bea23886291c860d5e55d9d24" exitCode=0 Mar 20 07:28:15 crc kubenswrapper[4749]: I0320 07:28:15.515440 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mx87z" Mar 20 07:28:15 crc kubenswrapper[4749]: I0320 07:28:15.515484 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mx87z" event={"ID":"062c83dc-bea8-474f-86b5-3f91b2645a6b","Type":"ContainerDied","Data":"ac83592ed70f69c56abe24160d8829ea48fff89bea23886291c860d5e55d9d24"} Mar 20 07:28:15 crc kubenswrapper[4749]: I0320 07:28:15.515535 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mx87z" event={"ID":"062c83dc-bea8-474f-86b5-3f91b2645a6b","Type":"ContainerDied","Data":"bdd6ac276ff3a0a7ea0f1f5b579f05526930fe5d664d2384a8b42d91ea0f0fc1"} Mar 20 07:28:15 crc kubenswrapper[4749]: I0320 07:28:15.515558 4749 scope.go:117] "RemoveContainer" containerID="ac83592ed70f69c56abe24160d8829ea48fff89bea23886291c860d5e55d9d24" Mar 20 07:28:15 crc kubenswrapper[4749]: I0320 07:28:15.548901 4749 scope.go:117] "RemoveContainer" containerID="509ec8723e9dff1fdc39d5c91992d96421e080c633ee9f9fdc0eedf27d49bb9c" Mar 20 07:28:15 crc kubenswrapper[4749]: I0320 07:28:15.560148 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mx87z"] Mar 20 07:28:15 crc kubenswrapper[4749]: I0320 07:28:15.567208 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mx87z"] Mar 20 07:28:15 crc kubenswrapper[4749]: I0320 07:28:15.571737 4749 scope.go:117] "RemoveContainer" containerID="61d46edba2d68fa394fece7f14b91d904ff69cc11088db7685d1a0a7a36e12fd" Mar 20 07:28:15 crc kubenswrapper[4749]: I0320 07:28:15.597472 4749 scope.go:117] "RemoveContainer" containerID="ac83592ed70f69c56abe24160d8829ea48fff89bea23886291c860d5e55d9d24" Mar 20 07:28:15 crc kubenswrapper[4749]: E0320 07:28:15.597956 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac83592ed70f69c56abe24160d8829ea48fff89bea23886291c860d5e55d9d24\": container with ID starting with ac83592ed70f69c56abe24160d8829ea48fff89bea23886291c860d5e55d9d24 not found: ID does not exist" containerID="ac83592ed70f69c56abe24160d8829ea48fff89bea23886291c860d5e55d9d24" Mar 20 07:28:15 crc kubenswrapper[4749]: I0320 07:28:15.598005 4749 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac83592ed70f69c56abe24160d8829ea48fff89bea23886291c860d5e55d9d24"} err="failed to get container status \"ac83592ed70f69c56abe24160d8829ea48fff89bea23886291c860d5e55d9d24\": rpc error: code = NotFound desc = could not find container \"ac83592ed70f69c56abe24160d8829ea48fff89bea23886291c860d5e55d9d24\": container with ID starting with ac83592ed70f69c56abe24160d8829ea48fff89bea23886291c860d5e55d9d24 not found: ID does not exist" Mar 20 07:28:15 crc kubenswrapper[4749]: I0320 07:28:15.598035 4749 scope.go:117] "RemoveContainer" containerID="509ec8723e9dff1fdc39d5c91992d96421e080c633ee9f9fdc0eedf27d49bb9c" Mar 20 07:28:15 crc kubenswrapper[4749]: E0320 07:28:15.598444 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"509ec8723e9dff1fdc39d5c91992d96421e080c633ee9f9fdc0eedf27d49bb9c\": container with ID starting with 509ec8723e9dff1fdc39d5c91992d96421e080c633ee9f9fdc0eedf27d49bb9c not found: ID does not exist" containerID="509ec8723e9dff1fdc39d5c91992d96421e080c633ee9f9fdc0eedf27d49bb9c" Mar 20 07:28:15 crc kubenswrapper[4749]: I0320 07:28:15.598504 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"509ec8723e9dff1fdc39d5c91992d96421e080c633ee9f9fdc0eedf27d49bb9c"} err="failed to get container status \"509ec8723e9dff1fdc39d5c91992d96421e080c633ee9f9fdc0eedf27d49bb9c\": rpc error: code = NotFound desc = could not find container \"509ec8723e9dff1fdc39d5c91992d96421e080c633ee9f9fdc0eedf27d49bb9c\": container with ID starting with 509ec8723e9dff1fdc39d5c91992d96421e080c633ee9f9fdc0eedf27d49bb9c not found: ID does not exist" Mar 20 07:28:15 crc kubenswrapper[4749]: I0320 07:28:15.598551 4749 scope.go:117] "RemoveContainer" containerID="61d46edba2d68fa394fece7f14b91d904ff69cc11088db7685d1a0a7a36e12fd" Mar 20 07:28:15 crc kubenswrapper[4749]: E0320 07:28:15.598977 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61d46edba2d68fa394fece7f14b91d904ff69cc11088db7685d1a0a7a36e12fd\": container with ID starting with 61d46edba2d68fa394fece7f14b91d904ff69cc11088db7685d1a0a7a36e12fd not found: ID does not exist" containerID="61d46edba2d68fa394fece7f14b91d904ff69cc11088db7685d1a0a7a36e12fd" Mar 20 07:28:15 crc kubenswrapper[4749]: I0320 07:28:15.599015 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61d46edba2d68fa394fece7f14b91d904ff69cc11088db7685d1a0a7a36e12fd"} err="failed to get container status \"61d46edba2d68fa394fece7f14b91d904ff69cc11088db7685d1a0a7a36e12fd\": rpc error: code = NotFound desc = could not find container \"61d46edba2d68fa394fece7f14b91d904ff69cc11088db7685d1a0a7a36e12fd\": container with ID starting with 61d46edba2d68fa394fece7f14b91d904ff69cc11088db7685d1a0a7a36e12fd not found: ID does not exist" Mar 20 07:28:16 crc kubenswrapper[4749]: I0320 07:28:16.183555 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="062c83dc-bea8-474f-86b5-3f91b2645a6b" path="/var/lib/kubelet/pods/062c83dc-bea8-474f-86b5-3f91b2645a6b/volumes" Mar 20 07:28:25 crc kubenswrapper[4749]: I0320 07:28:25.317175 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-784cc76666-86nb4" Mar 20 07:28:34 crc kubenswrapper[4749]: I0320 07:28:34.514606 4749 patch_prober.go:28] 
interesting pod/machine-config-daemon-fxqfd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 07:28:34 crc kubenswrapper[4749]: I0320 07:28:34.515163 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:28:45 crc kubenswrapper[4749]: I0320 07:28:45.062336 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-789686cc7-8s64f" Mar 20 07:28:45 crc kubenswrapper[4749]: I0320 07:28:45.905432 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-5ww9w"] Mar 20 07:28:45 crc kubenswrapper[4749]: E0320 07:28:45.905690 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="062c83dc-bea8-474f-86b5-3f91b2645a6b" containerName="extract-content" Mar 20 07:28:45 crc kubenswrapper[4749]: I0320 07:28:45.905704 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="062c83dc-bea8-474f-86b5-3f91b2645a6b" containerName="extract-content" Mar 20 07:28:45 crc kubenswrapper[4749]: E0320 07:28:45.905718 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="062c83dc-bea8-474f-86b5-3f91b2645a6b" containerName="extract-utilities" Mar 20 07:28:45 crc kubenswrapper[4749]: I0320 07:28:45.905725 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="062c83dc-bea8-474f-86b5-3f91b2645a6b" containerName="extract-utilities" Mar 20 07:28:45 crc kubenswrapper[4749]: E0320 07:28:45.905736 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="062c83dc-bea8-474f-86b5-3f91b2645a6b" containerName="registry-server" Mar 20 07:28:45 crc kubenswrapper[4749]: I0320 07:28:45.905744 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="062c83dc-bea8-474f-86b5-3f91b2645a6b" containerName="registry-server" Mar 20 07:28:45 crc kubenswrapper[4749]: I0320 07:28:45.905873 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="062c83dc-bea8-474f-86b5-3f91b2645a6b" containerName="registry-server" Mar 20 07:28:45 crc kubenswrapper[4749]: I0320 07:28:45.908061 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-5ww9w" Mar 20 07:28:45 crc kubenswrapper[4749]: I0320 07:28:45.908198 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p9db"] Mar 20 07:28:45 crc kubenswrapper[4749]: I0320 07:28:45.908631 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p9db" Mar 20 07:28:45 crc kubenswrapper[4749]: I0320 07:28:45.920843 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 20 07:28:45 crc kubenswrapper[4749]: I0320 07:28:45.921196 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-l5hct" Mar 20 07:28:45 crc kubenswrapper[4749]: I0320 07:28:45.921230 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 20 07:28:45 crc kubenswrapper[4749]: I0320 07:28:45.921063 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 20 07:28:45 crc kubenswrapper[4749]: I0320 07:28:45.941766 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p9db"] Mar 20 07:28:45 crc kubenswrapper[4749]: I0320 07:28:45.971835 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/82ef576a-cd53-4b86-8ddc-e2528fa1b23d-frr-sockets\") pod \"frr-k8s-5ww9w\" (UID: \"82ef576a-cd53-4b86-8ddc-e2528fa1b23d\") " pod="metallb-system/frr-k8s-5ww9w" Mar 20 07:28:45 crc kubenswrapper[4749]: I0320 07:28:45.971904 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/82ef576a-cd53-4b86-8ddc-e2528fa1b23d-metrics\") pod \"frr-k8s-5ww9w\" (UID: \"82ef576a-cd53-4b86-8ddc-e2528fa1b23d\") " pod="metallb-system/frr-k8s-5ww9w" Mar 20 07:28:45 crc kubenswrapper[4749]: I0320 07:28:45.971930 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w77hr\" (UniqueName: \"kubernetes.io/projected/82ef576a-cd53-4b86-8ddc-e2528fa1b23d-kube-api-access-w77hr\") pod \"frr-k8s-5ww9w\" (UID: \"82ef576a-cd53-4b86-8ddc-e2528fa1b23d\") " pod="metallb-system/frr-k8s-5ww9w" Mar 20 07:28:45 crc kubenswrapper[4749]: I0320 07:28:45.971954 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxgdt\" (UniqueName: \"kubernetes.io/projected/b235c243-14bb-4dba-9905-5bd230ae2879-kube-api-access-rxgdt\") pod \"frr-k8s-webhook-server-bcc4b6f68-5p9db\" (UID: \"b235c243-14bb-4dba-9905-5bd230ae2879\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p9db" Mar 20 07:28:45 crc kubenswrapper[4749]: I0320 07:28:45.971969 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/82ef576a-cd53-4b86-8ddc-e2528fa1b23d-frr-startup\") pod \"frr-k8s-5ww9w\" (UID: \"82ef576a-cd53-4b86-8ddc-e2528fa1b23d\") " pod="metallb-system/frr-k8s-5ww9w" Mar 20 07:28:45 crc kubenswrapper[4749]: I0320 07:28:45.971991 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b235c243-14bb-4dba-9905-5bd230ae2879-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-5p9db\" (UID: \"b235c243-14bb-4dba-9905-5bd230ae2879\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p9db" Mar 20 07:28:45 crc kubenswrapper[4749]: I0320 07:28:45.972006 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: 
\"kubernetes.io/empty-dir/82ef576a-cd53-4b86-8ddc-e2528fa1b23d-frr-conf\") pod \"frr-k8s-5ww9w\" (UID: \"82ef576a-cd53-4b86-8ddc-e2528fa1b23d\") " pod="metallb-system/frr-k8s-5ww9w" Mar 20 07:28:45 crc kubenswrapper[4749]: I0320 07:28:45.972026 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/82ef576a-cd53-4b86-8ddc-e2528fa1b23d-reloader\") pod \"frr-k8s-5ww9w\" (UID: \"82ef576a-cd53-4b86-8ddc-e2528fa1b23d\") " pod="metallb-system/frr-k8s-5ww9w" Mar 20 07:28:45 crc kubenswrapper[4749]: I0320 07:28:45.972044 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/82ef576a-cd53-4b86-8ddc-e2528fa1b23d-metrics-certs\") pod \"frr-k8s-5ww9w\" (UID: \"82ef576a-cd53-4b86-8ddc-e2528fa1b23d\") " pod="metallb-system/frr-k8s-5ww9w" Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.020533 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-s5f42"] Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.024706 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-s5f42" Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.026603 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.026877 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.027105 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-mw47k" Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.032336 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.044577 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-d7bq6"] Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.045475 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-d7bq6" Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.046972 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.059438 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-d7bq6"] Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.072781 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/82ef576a-cd53-4b86-8ddc-e2528fa1b23d-metrics-certs\") pod \"frr-k8s-5ww9w\" (UID: \"82ef576a-cd53-4b86-8ddc-e2528fa1b23d\") " pod="metallb-system/frr-k8s-5ww9w" Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.072857 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzbpk\" (UniqueName: \"kubernetes.io/projected/b50e6e0c-7ab7-4516-929a-49e48e00c1e2-kube-api-access-kzbpk\") pod \"controller-7bb4cc7c98-d7bq6\" (UID: \"b50e6e0c-7ab7-4516-929a-49e48e00c1e2\") " pod="metallb-system/controller-7bb4cc7c98-d7bq6" Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.072887 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/82ef576a-cd53-4b86-8ddc-e2528fa1b23d-frr-sockets\") pod \"frr-k8s-5ww9w\" (UID: \"82ef576a-cd53-4b86-8ddc-e2528fa1b23d\") " pod="metallb-system/frr-k8s-5ww9w" Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.072917 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b50e6e0c-7ab7-4516-929a-49e48e00c1e2-cert\") pod \"controller-7bb4cc7c98-d7bq6\" (UID: \"b50e6e0c-7ab7-4516-929a-49e48e00c1e2\") " pod="metallb-system/controller-7bb4cc7c98-d7bq6" Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.072958 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/82ef576a-cd53-4b86-8ddc-e2528fa1b23d-metrics\") pod \"frr-k8s-5ww9w\" (UID: \"82ef576a-cd53-4b86-8ddc-e2528fa1b23d\") " pod="metallb-system/frr-k8s-5ww9w" Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.072979 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w77hr\" (UniqueName: \"kubernetes.io/projected/82ef576a-cd53-4b86-8ddc-e2528fa1b23d-kube-api-access-w77hr\") pod \"frr-k8s-5ww9w\" (UID: \"82ef576a-cd53-4b86-8ddc-e2528fa1b23d\") " pod="metallb-system/frr-k8s-5ww9w" Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.073001 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae069ce2-c4d3-434f-8c66-95a75561bf8b-metrics-certs\") pod \"speaker-s5f42\" (UID: \"ae069ce2-c4d3-434f-8c66-95a75561bf8b\") " pod="metallb-system/speaker-s5f42" Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.073021 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ae069ce2-c4d3-434f-8c66-95a75561bf8b-memberlist\") pod \"speaker-s5f42\" (UID: \"ae069ce2-c4d3-434f-8c66-95a75561bf8b\") " pod="metallb-system/speaker-s5f42" Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.073048 4749 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-rxgdt\" (UniqueName: \"kubernetes.io/projected/b235c243-14bb-4dba-9905-5bd230ae2879-kube-api-access-rxgdt\") pod \"frr-k8s-webhook-server-bcc4b6f68-5p9db\" (UID: \"b235c243-14bb-4dba-9905-5bd230ae2879\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p9db" Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.073066 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/82ef576a-cd53-4b86-8ddc-e2528fa1b23d-frr-startup\") pod \"frr-k8s-5ww9w\" (UID: \"82ef576a-cd53-4b86-8ddc-e2528fa1b23d\") " pod="metallb-system/frr-k8s-5ww9w" Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.073088 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b50e6e0c-7ab7-4516-929a-49e48e00c1e2-metrics-certs\") pod \"controller-7bb4cc7c98-d7bq6\" (UID: \"b50e6e0c-7ab7-4516-929a-49e48e00c1e2\") " pod="metallb-system/controller-7bb4cc7c98-d7bq6" Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.073115 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b235c243-14bb-4dba-9905-5bd230ae2879-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-5p9db\" (UID: \"b235c243-14bb-4dba-9905-5bd230ae2879\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p9db" Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.073134 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/82ef576a-cd53-4b86-8ddc-e2528fa1b23d-frr-conf\") pod \"frr-k8s-5ww9w\" (UID: \"82ef576a-cd53-4b86-8ddc-e2528fa1b23d\") " pod="metallb-system/frr-k8s-5ww9w" Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.073161 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bww7g\" (UniqueName: \"kubernetes.io/projected/ae069ce2-c4d3-434f-8c66-95a75561bf8b-kube-api-access-bww7g\") pod \"speaker-s5f42\" (UID: \"ae069ce2-c4d3-434f-8c66-95a75561bf8b\") " pod="metallb-system/speaker-s5f42" Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.073180 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/82ef576a-cd53-4b86-8ddc-e2528fa1b23d-reloader\") pod \"frr-k8s-5ww9w\" (UID: \"82ef576a-cd53-4b86-8ddc-e2528fa1b23d\") " pod="metallb-system/frr-k8s-5ww9w" Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.073202 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ae069ce2-c4d3-434f-8c66-95a75561bf8b-metallb-excludel2\") pod \"speaker-s5f42\" (UID: \"ae069ce2-c4d3-434f-8c66-95a75561bf8b\") " pod="metallb-system/speaker-s5f42" Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.074515 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/82ef576a-cd53-4b86-8ddc-e2528fa1b23d-frr-sockets\") pod \"frr-k8s-5ww9w\" (UID: \"82ef576a-cd53-4b86-8ddc-e2528fa1b23d\") " pod="metallb-system/frr-k8s-5ww9w" Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.074782 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/82ef576a-cd53-4b86-8ddc-e2528fa1b23d-metrics\") pod \"frr-k8s-5ww9w\" 
(UID: \"82ef576a-cd53-4b86-8ddc-e2528fa1b23d\") " pod="metallb-system/frr-k8s-5ww9w" Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.075253 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/82ef576a-cd53-4b86-8ddc-e2528fa1b23d-frr-conf\") pod \"frr-k8s-5ww9w\" (UID: \"82ef576a-cd53-4b86-8ddc-e2528fa1b23d\") " pod="metallb-system/frr-k8s-5ww9w" Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.075683 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/82ef576a-cd53-4b86-8ddc-e2528fa1b23d-reloader\") pod \"frr-k8s-5ww9w\" (UID: \"82ef576a-cd53-4b86-8ddc-e2528fa1b23d\") " pod="metallb-system/frr-k8s-5ww9w" Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.075816 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/82ef576a-cd53-4b86-8ddc-e2528fa1b23d-frr-startup\") pod \"frr-k8s-5ww9w\" (UID: \"82ef576a-cd53-4b86-8ddc-e2528fa1b23d\") " pod="metallb-system/frr-k8s-5ww9w" Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.080327 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b235c243-14bb-4dba-9905-5bd230ae2879-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-5p9db\" (UID: \"b235c243-14bb-4dba-9905-5bd230ae2879\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p9db" Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.083771 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/82ef576a-cd53-4b86-8ddc-e2528fa1b23d-metrics-certs\") pod \"frr-k8s-5ww9w\" (UID: \"82ef576a-cd53-4b86-8ddc-e2528fa1b23d\") " pod="metallb-system/frr-k8s-5ww9w" Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.092426 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxgdt\" (UniqueName: \"kubernetes.io/projected/b235c243-14bb-4dba-9905-5bd230ae2879-kube-api-access-rxgdt\") pod \"frr-k8s-webhook-server-bcc4b6f68-5p9db\" (UID: \"b235c243-14bb-4dba-9905-5bd230ae2879\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p9db" Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.093824 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w77hr\" (UniqueName: \"kubernetes.io/projected/82ef576a-cd53-4b86-8ddc-e2528fa1b23d-kube-api-access-w77hr\") pod \"frr-k8s-5ww9w\" (UID: \"82ef576a-cd53-4b86-8ddc-e2528fa1b23d\") " pod="metallb-system/frr-k8s-5ww9w" Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.174082 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b50e6e0c-7ab7-4516-929a-49e48e00c1e2-cert\") pod \"controller-7bb4cc7c98-d7bq6\" (UID: \"b50e6e0c-7ab7-4516-929a-49e48e00c1e2\") " pod="metallb-system/controller-7bb4cc7c98-d7bq6" Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.174154 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae069ce2-c4d3-434f-8c66-95a75561bf8b-metrics-certs\") pod \"speaker-s5f42\" (UID: \"ae069ce2-c4d3-434f-8c66-95a75561bf8b\") " pod="metallb-system/speaker-s5f42" Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.174174 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: 
\"kubernetes.io/secret/ae069ce2-c4d3-434f-8c66-95a75561bf8b-memberlist\") pod \"speaker-s5f42\" (UID: \"ae069ce2-c4d3-434f-8c66-95a75561bf8b\") " pod="metallb-system/speaker-s5f42" Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.174199 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b50e6e0c-7ab7-4516-929a-49e48e00c1e2-metrics-certs\") pod \"controller-7bb4cc7c98-d7bq6\" (UID: \"b50e6e0c-7ab7-4516-929a-49e48e00c1e2\") " pod="metallb-system/controller-7bb4cc7c98-d7bq6" Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.174234 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bww7g\" (UniqueName: \"kubernetes.io/projected/ae069ce2-c4d3-434f-8c66-95a75561bf8b-kube-api-access-bww7g\") pod \"speaker-s5f42\" (UID: \"ae069ce2-c4d3-434f-8c66-95a75561bf8b\") " pod="metallb-system/speaker-s5f42" Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.174256 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ae069ce2-c4d3-434f-8c66-95a75561bf8b-metallb-excludel2\") pod \"speaker-s5f42\" (UID: \"ae069ce2-c4d3-434f-8c66-95a75561bf8b\") " pod="metallb-system/speaker-s5f42" Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.174321 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzbpk\" (UniqueName: \"kubernetes.io/projected/b50e6e0c-7ab7-4516-929a-49e48e00c1e2-kube-api-access-kzbpk\") pod \"controller-7bb4cc7c98-d7bq6\" (UID: \"b50e6e0c-7ab7-4516-929a-49e48e00c1e2\") " pod="metallb-system/controller-7bb4cc7c98-d7bq6" Mar 20 07:28:46 crc kubenswrapper[4749]: E0320 07:28:46.174551 4749 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 20 07:28:46 crc kubenswrapper[4749]: E0320 07:28:46.174645 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae069ce2-c4d3-434f-8c66-95a75561bf8b-memberlist podName:ae069ce2-c4d3-434f-8c66-95a75561bf8b nodeName:}" failed. No retries permitted until 2026-03-20 07:28:46.674622471 +0000 UTC m=+963.224280218 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/ae069ce2-c4d3-434f-8c66-95a75561bf8b-memberlist") pod "speaker-s5f42" (UID: "ae069ce2-c4d3-434f-8c66-95a75561bf8b") : secret "metallb-memberlist" not found Mar 20 07:28:46 crc kubenswrapper[4749]: E0320 07:28:46.174737 4749 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Mar 20 07:28:46 crc kubenswrapper[4749]: E0320 07:28:46.174799 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b50e6e0c-7ab7-4516-929a-49e48e00c1e2-metrics-certs podName:b50e6e0c-7ab7-4516-929a-49e48e00c1e2 nodeName:}" failed. No retries permitted until 2026-03-20 07:28:46.674777955 +0000 UTC m=+963.224435712 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b50e6e0c-7ab7-4516-929a-49e48e00c1e2-metrics-certs") pod "controller-7bb4cc7c98-d7bq6" (UID: "b50e6e0c-7ab7-4516-929a-49e48e00c1e2") : secret "controller-certs-secret" not found Mar 20 07:28:46 crc kubenswrapper[4749]: E0320 07:28:46.174983 4749 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Mar 20 07:28:46 crc kubenswrapper[4749]: E0320 07:28:46.175025 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae069ce2-c4d3-434f-8c66-95a75561bf8b-metrics-certs podName:ae069ce2-c4d3-434f-8c66-95a75561bf8b nodeName:}" failed. No retries permitted until 2026-03-20 07:28:46.67501535 +0000 UTC m=+963.224673127 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ae069ce2-c4d3-434f-8c66-95a75561bf8b-metrics-certs") pod "speaker-s5f42" (UID: "ae069ce2-c4d3-434f-8c66-95a75561bf8b") : secret "speaker-certs-secret" not found Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.175643 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ae069ce2-c4d3-434f-8c66-95a75561bf8b-metallb-excludel2\") pod \"speaker-s5f42\" (UID: \"ae069ce2-c4d3-434f-8c66-95a75561bf8b\") " pod="metallb-system/speaker-s5f42" Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.177149 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.188691 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b50e6e0c-7ab7-4516-929a-49e48e00c1e2-cert\") pod \"controller-7bb4cc7c98-d7bq6\" (UID: \"b50e6e0c-7ab7-4516-929a-49e48e00c1e2\") " pod="metallb-system/controller-7bb4cc7c98-d7bq6" Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.197428 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bww7g\" (UniqueName: \"kubernetes.io/projected/ae069ce2-c4d3-434f-8c66-95a75561bf8b-kube-api-access-bww7g\") pod \"speaker-s5f42\" (UID: \"ae069ce2-c4d3-434f-8c66-95a75561bf8b\") " pod="metallb-system/speaker-s5f42" Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.197855 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzbpk\" (UniqueName: \"kubernetes.io/projected/b50e6e0c-7ab7-4516-929a-49e48e00c1e2-kube-api-access-kzbpk\") pod \"controller-7bb4cc7c98-d7bq6\" (UID: \"b50e6e0c-7ab7-4516-929a-49e48e00c1e2\") " pod="metallb-system/controller-7bb4cc7c98-d7bq6" Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.244731 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-5ww9w" Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.273931 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p9db" Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.544388 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p9db"] Mar 20 07:28:46 crc kubenswrapper[4749]: W0320 07:28:46.548668 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb235c243_14bb_4dba_9905_5bd230ae2879.slice/crio-e420ad6ff54b9421a9e8692eae3717c44aeaf7db87af3aa9d131d4ad9e4dec33 WatchSource:0}: Error finding container e420ad6ff54b9421a9e8692eae3717c44aeaf7db87af3aa9d131d4ad9e4dec33: Status 404 returned error can't find the container with id e420ad6ff54b9421a9e8692eae3717c44aeaf7db87af3aa9d131d4ad9e4dec33 Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.682115 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae069ce2-c4d3-434f-8c66-95a75561bf8b-metrics-certs\") pod \"speaker-s5f42\" (UID: \"ae069ce2-c4d3-434f-8c66-95a75561bf8b\") " pod="metallb-system/speaker-s5f42" Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.682191 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ae069ce2-c4d3-434f-8c66-95a75561bf8b-memberlist\") pod \"speaker-s5f42\" (UID: \"ae069ce2-c4d3-434f-8c66-95a75561bf8b\") " pod="metallb-system/speaker-s5f42" Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.682238 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b50e6e0c-7ab7-4516-929a-49e48e00c1e2-metrics-certs\") pod \"controller-7bb4cc7c98-d7bq6\" (UID: \"b50e6e0c-7ab7-4516-929a-49e48e00c1e2\") " pod="metallb-system/controller-7bb4cc7c98-d7bq6" Mar 20 07:28:46 crc kubenswrapper[4749]: E0320 07:28:46.682410 4749 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 20 07:28:46 crc kubenswrapper[4749]: E0320 07:28:46.682463 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae069ce2-c4d3-434f-8c66-95a75561bf8b-memberlist podName:ae069ce2-c4d3-434f-8c66-95a75561bf8b nodeName:}" failed. No retries permitted until 2026-03-20 07:28:47.682449476 +0000 UTC m=+964.232107123 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/ae069ce2-c4d3-434f-8c66-95a75561bf8b-memberlist") pod "speaker-s5f42" (UID: "ae069ce2-c4d3-434f-8c66-95a75561bf8b") : secret "metallb-memberlist" not found Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.686983 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b50e6e0c-7ab7-4516-929a-49e48e00c1e2-metrics-certs\") pod \"controller-7bb4cc7c98-d7bq6\" (UID: \"b50e6e0c-7ab7-4516-929a-49e48e00c1e2\") " pod="metallb-system/controller-7bb4cc7c98-d7bq6" Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.688707 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae069ce2-c4d3-434f-8c66-95a75561bf8b-metrics-certs\") pod \"speaker-s5f42\" (UID: \"ae069ce2-c4d3-434f-8c66-95a75561bf8b\") " pod="metallb-system/speaker-s5f42" Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.742240 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5ww9w" event={"ID":"82ef576a-cd53-4b86-8ddc-e2528fa1b23d","Type":"ContainerStarted","Data":"cbe10e62e97055b9eecabb5eadaa12a42cabb6f717dc2c0ea934f0787478339d"} Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.744024 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p9db" event={"ID":"b235c243-14bb-4dba-9905-5bd230ae2879","Type":"ContainerStarted","Data":"e420ad6ff54b9421a9e8692eae3717c44aeaf7db87af3aa9d131d4ad9e4dec33"} Mar 20 07:28:46 crc kubenswrapper[4749]: I0320 07:28:46.956649 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-d7bq6" Mar 20 07:28:47 crc kubenswrapper[4749]: I0320 07:28:47.441586 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-d7bq6"] Mar 20 07:28:47 crc kubenswrapper[4749]: I0320 07:28:47.699678 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ae069ce2-c4d3-434f-8c66-95a75561bf8b-memberlist\") pod \"speaker-s5f42\" (UID: \"ae069ce2-c4d3-434f-8c66-95a75561bf8b\") " pod="metallb-system/speaker-s5f42" Mar 20 07:28:47 crc kubenswrapper[4749]: I0320 07:28:47.706018 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ae069ce2-c4d3-434f-8c66-95a75561bf8b-memberlist\") pod \"speaker-s5f42\" (UID: \"ae069ce2-c4d3-434f-8c66-95a75561bf8b\") " pod="metallb-system/speaker-s5f42" Mar 20 07:28:47 crc kubenswrapper[4749]: I0320 07:28:47.752865 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-d7bq6" event={"ID":"b50e6e0c-7ab7-4516-929a-49e48e00c1e2","Type":"ContainerStarted","Data":"79395402c34b7bb4bde902cceee4c9ad3db14c61fd8f66aaef2667862b9fd437"} Mar 20 07:28:47 crc kubenswrapper[4749]: I0320 07:28:47.752911 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-d7bq6" event={"ID":"b50e6e0c-7ab7-4516-929a-49e48e00c1e2","Type":"ContainerStarted","Data":"dedd93eef8bd31358f45ddede26125dd1b835704f126c1d33ea1b560843d99e8"} Mar 20 07:28:47 crc kubenswrapper[4749]: I0320 07:28:47.837063 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-s5f42" Mar 20 07:28:47 crc kubenswrapper[4749]: W0320 07:28:47.866904 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae069ce2_c4d3_434f_8c66_95a75561bf8b.slice/crio-29f7dbf5f846ad6309a8e4a0be35a631018a3908a916cdd74d105bb30b5e2c0e WatchSource:0}: Error finding container 29f7dbf5f846ad6309a8e4a0be35a631018a3908a916cdd74d105bb30b5e2c0e: Status 404 returned error can't find the container with id 29f7dbf5f846ad6309a8e4a0be35a631018a3908a916cdd74d105bb30b5e2c0e Mar 20 07:28:48 crc kubenswrapper[4749]: I0320 07:28:48.769715 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-d7bq6" event={"ID":"b50e6e0c-7ab7-4516-929a-49e48e00c1e2","Type":"ContainerStarted","Data":"a95e3f18e5a149e4272bb6db4f02017053628e1f2b79efe30b9fb8bb5d001e41"} Mar 20 07:28:48 crc kubenswrapper[4749]: I0320 07:28:48.770976 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-d7bq6" Mar 20 07:28:48 crc kubenswrapper[4749]: I0320 07:28:48.773044 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-s5f42" event={"ID":"ae069ce2-c4d3-434f-8c66-95a75561bf8b","Type":"ContainerStarted","Data":"87d336321a3a008c60d71864abeb53af9b8d40d2a9d11452e3ad5118850f1b3d"} Mar 20 07:28:48 crc kubenswrapper[4749]: I0320 07:28:48.773088 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-s5f42" event={"ID":"ae069ce2-c4d3-434f-8c66-95a75561bf8b","Type":"ContainerStarted","Data":"8f91a1cc16281651f68e68d5d1bda6d78f466b88bc1f245bef7b73589b697e3a"} Mar 20 07:28:48 crc kubenswrapper[4749]: I0320 07:28:48.773106 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-s5f42" event={"ID":"ae069ce2-c4d3-434f-8c66-95a75561bf8b","Type":"ContainerStarted","Data":"29f7dbf5f846ad6309a8e4a0be35a631018a3908a916cdd74d105bb30b5e2c0e"} Mar 20 07:28:48 crc kubenswrapper[4749]: I0320 07:28:48.773564 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-s5f42" Mar 20 07:28:48 crc kubenswrapper[4749]: I0320 07:28:48.786135 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-d7bq6" podStartSLOduration=2.78611826 podStartE2EDuration="2.78611826s" podCreationTimestamp="2026-03-20 07:28:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:28:48.78573222 +0000 UTC m=+965.335389867" watchObservedRunningTime="2026-03-20 07:28:48.78611826 +0000 UTC m=+965.335775907" Mar 20 07:28:48 crc kubenswrapper[4749]: I0320 07:28:48.809447 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-s5f42" podStartSLOduration=2.809421882 podStartE2EDuration="2.809421882s" podCreationTimestamp="2026-03-20 07:28:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:28:48.798804516 +0000 UTC m=+965.348462153" watchObservedRunningTime="2026-03-20 07:28:48.809421882 +0000 UTC m=+965.359079549" Mar 20 07:28:52 crc kubenswrapper[4749]: I0320 07:28:52.245232 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lhvxm"] Mar 20 07:28:52 crc kubenswrapper[4749]: I0320 07:28:52.248365 4749 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lhvxm" Mar 20 07:28:52 crc kubenswrapper[4749]: I0320 07:28:52.275018 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lhvxm"] Mar 20 07:28:52 crc kubenswrapper[4749]: I0320 07:28:52.301252 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31055e7f-09ed-4429-a033-18a15794dd9c-utilities\") pod \"certified-operators-lhvxm\" (UID: \"31055e7f-09ed-4429-a033-18a15794dd9c\") " pod="openshift-marketplace/certified-operators-lhvxm" Mar 20 07:28:52 crc kubenswrapper[4749]: I0320 07:28:52.301370 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31055e7f-09ed-4429-a033-18a15794dd9c-catalog-content\") pod \"certified-operators-lhvxm\" (UID: \"31055e7f-09ed-4429-a033-18a15794dd9c\") " pod="openshift-marketplace/certified-operators-lhvxm" Mar 20 07:28:52 crc kubenswrapper[4749]: I0320 07:28:52.301411 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk5nx\" (UniqueName: \"kubernetes.io/projected/31055e7f-09ed-4429-a033-18a15794dd9c-kube-api-access-zk5nx\") pod \"certified-operators-lhvxm\" (UID: \"31055e7f-09ed-4429-a033-18a15794dd9c\") " pod="openshift-marketplace/certified-operators-lhvxm" Mar 20 07:28:52 crc kubenswrapper[4749]: I0320 07:28:52.402362 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31055e7f-09ed-4429-a033-18a15794dd9c-utilities\") pod \"certified-operators-lhvxm\" (UID: \"31055e7f-09ed-4429-a033-18a15794dd9c\") " pod="openshift-marketplace/certified-operators-lhvxm" Mar 20 07:28:52 crc kubenswrapper[4749]: I0320 07:28:52.402446 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31055e7f-09ed-4429-a033-18a15794dd9c-catalog-content\") pod \"certified-operators-lhvxm\" (UID: \"31055e7f-09ed-4429-a033-18a15794dd9c\") " pod="openshift-marketplace/certified-operators-lhvxm" Mar 20 07:28:52 crc kubenswrapper[4749]: I0320 07:28:52.402473 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk5nx\" (UniqueName: \"kubernetes.io/projected/31055e7f-09ed-4429-a033-18a15794dd9c-kube-api-access-zk5nx\") pod \"certified-operators-lhvxm\" (UID: \"31055e7f-09ed-4429-a033-18a15794dd9c\") " pod="openshift-marketplace/certified-operators-lhvxm" Mar 20 07:28:52 crc kubenswrapper[4749]: I0320 07:28:52.402824 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31055e7f-09ed-4429-a033-18a15794dd9c-utilities\") pod \"certified-operators-lhvxm\" (UID: \"31055e7f-09ed-4429-a033-18a15794dd9c\") " pod="openshift-marketplace/certified-operators-lhvxm" Mar 20 07:28:52 crc kubenswrapper[4749]: I0320 07:28:52.403020 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31055e7f-09ed-4429-a033-18a15794dd9c-catalog-content\") pod \"certified-operators-lhvxm\" (UID: \"31055e7f-09ed-4429-a033-18a15794dd9c\") " pod="openshift-marketplace/certified-operators-lhvxm" Mar 20 07:28:52 crc kubenswrapper[4749]: I0320 
07:28:52.431099 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk5nx\" (UniqueName: \"kubernetes.io/projected/31055e7f-09ed-4429-a033-18a15794dd9c-kube-api-access-zk5nx\") pod \"certified-operators-lhvxm\" (UID: \"31055e7f-09ed-4429-a033-18a15794dd9c\") " pod="openshift-marketplace/certified-operators-lhvxm" Mar 20 07:28:52 crc kubenswrapper[4749]: I0320 07:28:52.582105 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lhvxm" Mar 20 07:28:53 crc kubenswrapper[4749]: I0320 07:28:52.998030 4749 scope.go:117] "RemoveContainer" containerID="ce9797d774005ed22ac91896d8e0594d834d974f75b6894990b7f17919a4a2db" Mar 20 07:28:54 crc kubenswrapper[4749]: I0320 07:28:54.550005 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lhvxm"] Mar 20 07:28:54 crc kubenswrapper[4749]: W0320 07:28:54.557981 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31055e7f_09ed_4429_a033_18a15794dd9c.slice/crio-8538c10f521b643543ed0738fa6b08cd922966afbb8acad2f6bc3a0a3f610aea WatchSource:0}: Error finding container 8538c10f521b643543ed0738fa6b08cd922966afbb8acad2f6bc3a0a3f610aea: Status 404 returned error can't find the container with id 8538c10f521b643543ed0738fa6b08cd922966afbb8acad2f6bc3a0a3f610aea Mar 20 07:28:54 crc kubenswrapper[4749]: I0320 07:28:54.818181 4749 generic.go:334] "Generic (PLEG): container finished" podID="31055e7f-09ed-4429-a033-18a15794dd9c" containerID="fd216e548d7138ef3e9e9ecd959b7688e7950c52f2007e4dae0f73f123ad684e" exitCode=0 Mar 20 07:28:54 crc kubenswrapper[4749]: I0320 07:28:54.818813 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lhvxm" event={"ID":"31055e7f-09ed-4429-a033-18a15794dd9c","Type":"ContainerDied","Data":"fd216e548d7138ef3e9e9ecd959b7688e7950c52f2007e4dae0f73f123ad684e"} Mar 20 07:28:54 crc kubenswrapper[4749]: I0320 07:28:54.819014 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lhvxm" event={"ID":"31055e7f-09ed-4429-a033-18a15794dd9c","Type":"ContainerStarted","Data":"8538c10f521b643543ed0738fa6b08cd922966afbb8acad2f6bc3a0a3f610aea"} Mar 20 07:28:54 crc kubenswrapper[4749]: I0320 07:28:54.828827 4749 generic.go:334] "Generic (PLEG): container finished" podID="82ef576a-cd53-4b86-8ddc-e2528fa1b23d" containerID="a3f0c9f5617fcb1037fa2c9bfcb78746f0d06dbd5d609a3209520f986294ccaf" exitCode=0 Mar 20 07:28:54 crc kubenswrapper[4749]: I0320 07:28:54.828906 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5ww9w" event={"ID":"82ef576a-cd53-4b86-8ddc-e2528fa1b23d","Type":"ContainerDied","Data":"a3f0c9f5617fcb1037fa2c9bfcb78746f0d06dbd5d609a3209520f986294ccaf"} Mar 20 07:28:54 crc kubenswrapper[4749]: I0320 07:28:54.830718 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p9db" event={"ID":"b235c243-14bb-4dba-9905-5bd230ae2879","Type":"ContainerStarted","Data":"8665cdcf972d5a588821ab75a54b4b41c744d98cfe340dbff2a92adc66408f82"} Mar 20 07:28:54 crc kubenswrapper[4749]: I0320 07:28:54.830949 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p9db" Mar 20 07:28:54 crc kubenswrapper[4749]: I0320 07:28:54.895503 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p9db" podStartSLOduration=2.299015629 podStartE2EDuration="9.895481658s" podCreationTimestamp="2026-03-20 07:28:45 +0000 UTC" firstStartedPulling="2026-03-20 07:28:46.550724715 +0000 UTC m=+963.100382362" lastFinishedPulling="2026-03-20 07:28:54.147190744 +0000 UTC m=+970.696848391" observedRunningTime="2026-03-20 07:28:54.891908232 +0000 UTC m=+971.441565889" watchObservedRunningTime="2026-03-20 07:28:54.895481658 +0000 UTC m=+971.445139315" Mar 20 07:28:55 crc kubenswrapper[4749]: I0320 07:28:55.843563 4749 generic.go:334] "Generic (PLEG): container finished" podID="82ef576a-cd53-4b86-8ddc-e2528fa1b23d" containerID="4d32b6b57c186d689e5ba9bba76a8333ca1cae2695747d44a1a9a7100ab2b1bb" exitCode=0 Mar 20 07:28:55 crc kubenswrapper[4749]: I0320 07:28:55.844148 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5ww9w" event={"ID":"82ef576a-cd53-4b86-8ddc-e2528fa1b23d","Type":"ContainerDied","Data":"4d32b6b57c186d689e5ba9bba76a8333ca1cae2695747d44a1a9a7100ab2b1bb"} Mar 20 07:28:56 crc kubenswrapper[4749]: I0320 07:28:56.869138 4749 generic.go:334] "Generic (PLEG): container finished" podID="82ef576a-cd53-4b86-8ddc-e2528fa1b23d" containerID="750f7878b132ac342e44bab16539972af496a0e12b119260d40654bde2b1c3fc" exitCode=0 Mar 20 07:28:56 crc kubenswrapper[4749]: I0320 07:28:56.869198 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5ww9w" event={"ID":"82ef576a-cd53-4b86-8ddc-e2528fa1b23d","Type":"ContainerDied","Data":"750f7878b132ac342e44bab16539972af496a0e12b119260d40654bde2b1c3fc"} Mar 20 07:28:56 crc kubenswrapper[4749]: I0320 07:28:56.874344 4749 generic.go:334] "Generic (PLEG): container finished" podID="31055e7f-09ed-4429-a033-18a15794dd9c" containerID="5b40e532afaf6c99444f304bb237ce1ff7465ed9e4612217a14795f23118703d" exitCode=0 Mar 20 07:28:56 crc kubenswrapper[4749]: I0320 07:28:56.874463 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lhvxm" event={"ID":"31055e7f-09ed-4429-a033-18a15794dd9c","Type":"ContainerDied","Data":"5b40e532afaf6c99444f304bb237ce1ff7465ed9e4612217a14795f23118703d"} Mar 20 07:28:57 crc kubenswrapper[4749]: I0320 07:28:57.892976 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lhvxm" event={"ID":"31055e7f-09ed-4429-a033-18a15794dd9c","Type":"ContainerStarted","Data":"b92ba90002bb93618b20b5ceb94f5d2c1115f3fc31db6e3c1dfdd2842233b86e"} Mar 20 07:28:57 crc kubenswrapper[4749]: I0320 07:28:57.898352 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5ww9w" event={"ID":"82ef576a-cd53-4b86-8ddc-e2528fa1b23d","Type":"ContainerStarted","Data":"b5b626e37fcdf6eb36371f4732b176e0ba1d229b0780102f82d2902afa6f0467"} Mar 20 07:28:57 crc kubenswrapper[4749]: I0320 07:28:57.898433 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5ww9w" event={"ID":"82ef576a-cd53-4b86-8ddc-e2528fa1b23d","Type":"ContainerStarted","Data":"94860caea5e23303e3908f6e6b7f5542d2d90389d33e380fc6de2946b14afdd2"} Mar 20 07:28:57 crc kubenswrapper[4749]: I0320 07:28:57.898531 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5ww9w" event={"ID":"82ef576a-cd53-4b86-8ddc-e2528fa1b23d","Type":"ContainerStarted","Data":"90662ab32751e2b9c7d116b944f5fb422c97fcd237a46a02a1a5ae09af034f16"} Mar 20 07:28:57 crc kubenswrapper[4749]: I0320 07:28:57.898550 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-5ww9w" event={"ID":"82ef576a-cd53-4b86-8ddc-e2528fa1b23d","Type":"ContainerStarted","Data":"17b1a7dd80e9b9ecde7853c07be67674536079518a79096ec6e601153371d8d7"} Mar 20 07:28:57 crc kubenswrapper[4749]: I0320 07:28:57.898566 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5ww9w" event={"ID":"82ef576a-cd53-4b86-8ddc-e2528fa1b23d","Type":"ContainerStarted","Data":"af1a239c89738148ec2efcbf31134cfe3167e0da5b7e8f6ad5814fb4814cce01"} Mar 20 07:28:57 crc kubenswrapper[4749]: I0320 07:28:57.920507 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lhvxm" podStartSLOduration=3.401417165 podStartE2EDuration="5.920473568s" podCreationTimestamp="2026-03-20 07:28:52 +0000 UTC" firstStartedPulling="2026-03-20 07:28:54.821322637 +0000 UTC m=+971.370980294" lastFinishedPulling="2026-03-20 07:28:57.34037905 +0000 UTC m=+973.890036697" observedRunningTime="2026-03-20 07:28:57.912581606 +0000 UTC m=+974.462239283" watchObservedRunningTime="2026-03-20 07:28:57.920473568 +0000 UTC m=+974.470131305" Mar 20 07:28:58 crc kubenswrapper[4749]: I0320 07:28:58.910635 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5ww9w" event={"ID":"82ef576a-cd53-4b86-8ddc-e2528fa1b23d","Type":"ContainerStarted","Data":"30cc62d2a8d8eef84f1a722b58432a231ff9309b4131817a8ef36630de4df2f6"} Mar 20 07:28:58 crc kubenswrapper[4749]: I0320 07:28:58.951902 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-5ww9w" podStartSLOduration=6.127530759 podStartE2EDuration="13.951874521s" podCreationTimestamp="2026-03-20 07:28:45 +0000 UTC" firstStartedPulling="2026-03-20 07:28:46.382168464 +0000 UTC m=+962.931826111" lastFinishedPulling="2026-03-20 07:28:54.206512186 +0000 UTC m=+970.756169873" observedRunningTime="2026-03-20 07:28:58.947405973 +0000 UTC m=+975.497063660" watchObservedRunningTime="2026-03-20 07:28:58.951874521 +0000 UTC m=+975.501532188" Mar 20 07:28:59 crc kubenswrapper[4749]: I0320 07:28:59.919673 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-5ww9w" Mar 20 07:29:01 crc kubenswrapper[4749]: I0320 07:29:01.246041 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-5ww9w" Mar 20 07:29:01 crc kubenswrapper[4749]: I0320 07:29:01.313502 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-5ww9w" Mar 20 07:29:02 crc kubenswrapper[4749]: I0320 07:29:02.582549 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lhvxm" Mar 20 07:29:02 crc kubenswrapper[4749]: I0320 07:29:02.582876 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lhvxm" Mar 20 07:29:02 crc kubenswrapper[4749]: I0320 07:29:02.639458 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lhvxm" Mar 20 07:29:03 crc kubenswrapper[4749]: I0320 07:29:03.006635 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lhvxm" Mar 20 07:29:03 crc kubenswrapper[4749]: I0320 07:29:03.074254 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lhvxm"] Mar 20 07:29:04 crc kubenswrapper[4749]: I0320 07:29:04.515024 4749 
patch_prober.go:28] interesting pod/machine-config-daemon-fxqfd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 07:29:04 crc kubenswrapper[4749]: I0320 07:29:04.515115 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:29:04 crc kubenswrapper[4749]: I0320 07:29:04.957149 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lhvxm" podUID="31055e7f-09ed-4429-a033-18a15794dd9c" containerName="registry-server" containerID="cri-o://b92ba90002bb93618b20b5ceb94f5d2c1115f3fc31db6e3c1dfdd2842233b86e" gracePeriod=2 Mar 20 07:29:05 crc kubenswrapper[4749]: I0320 07:29:05.286322 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qcw9v"] Mar 20 07:29:05 crc kubenswrapper[4749]: I0320 07:29:05.288720 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qcw9v" Mar 20 07:29:05 crc kubenswrapper[4749]: I0320 07:29:05.307805 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qcw9v"] Mar 20 07:29:05 crc kubenswrapper[4749]: I0320 07:29:05.377974 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2031d0a6-251a-4d48-b42a-25d55600ddb8-utilities\") pod \"community-operators-qcw9v\" (UID: \"2031d0a6-251a-4d48-b42a-25d55600ddb8\") " pod="openshift-marketplace/community-operators-qcw9v" Mar 20 07:29:05 crc kubenswrapper[4749]: I0320 07:29:05.378083 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2031d0a6-251a-4d48-b42a-25d55600ddb8-catalog-content\") pod \"community-operators-qcw9v\" (UID: \"2031d0a6-251a-4d48-b42a-25d55600ddb8\") " pod="openshift-marketplace/community-operators-qcw9v" Mar 20 07:29:05 crc kubenswrapper[4749]: I0320 07:29:05.378211 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bktmb\" (UniqueName: \"kubernetes.io/projected/2031d0a6-251a-4d48-b42a-25d55600ddb8-kube-api-access-bktmb\") pod \"community-operators-qcw9v\" (UID: \"2031d0a6-251a-4d48-b42a-25d55600ddb8\") " pod="openshift-marketplace/community-operators-qcw9v" Mar 20 07:29:05 crc kubenswrapper[4749]: I0320 07:29:05.478418 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lhvxm" Mar 20 07:29:05 crc kubenswrapper[4749]: I0320 07:29:05.479034 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2031d0a6-251a-4d48-b42a-25d55600ddb8-utilities\") pod \"community-operators-qcw9v\" (UID: \"2031d0a6-251a-4d48-b42a-25d55600ddb8\") " pod="openshift-marketplace/community-operators-qcw9v" Mar 20 07:29:05 crc kubenswrapper[4749]: I0320 07:29:05.479110 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2031d0a6-251a-4d48-b42a-25d55600ddb8-catalog-content\") pod \"community-operators-qcw9v\" (UID: \"2031d0a6-251a-4d48-b42a-25d55600ddb8\") " pod="openshift-marketplace/community-operators-qcw9v" Mar 20 07:29:05 crc kubenswrapper[4749]: I0320 07:29:05.479139 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bktmb\" (UniqueName: \"kubernetes.io/projected/2031d0a6-251a-4d48-b42a-25d55600ddb8-kube-api-access-bktmb\") pod \"community-operators-qcw9v\" (UID: \"2031d0a6-251a-4d48-b42a-25d55600ddb8\") " pod="openshift-marketplace/community-operators-qcw9v" Mar 20 07:29:05 crc kubenswrapper[4749]: I0320 07:29:05.479524 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2031d0a6-251a-4d48-b42a-25d55600ddb8-utilities\") pod \"community-operators-qcw9v\" (UID: \"2031d0a6-251a-4d48-b42a-25d55600ddb8\") " pod="openshift-marketplace/community-operators-qcw9v" Mar 20 07:29:05 crc kubenswrapper[4749]: I0320 07:29:05.479658 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2031d0a6-251a-4d48-b42a-25d55600ddb8-catalog-content\") pod \"community-operators-qcw9v\" (UID: \"2031d0a6-251a-4d48-b42a-25d55600ddb8\") " pod="openshift-marketplace/community-operators-qcw9v" Mar 20 07:29:05 crc kubenswrapper[4749]: I0320 07:29:05.499311 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bktmb\" (UniqueName: \"kubernetes.io/projected/2031d0a6-251a-4d48-b42a-25d55600ddb8-kube-api-access-bktmb\") pod \"community-operators-qcw9v\" (UID: \"2031d0a6-251a-4d48-b42a-25d55600ddb8\") " pod="openshift-marketplace/community-operators-qcw9v" Mar 20 07:29:05 crc kubenswrapper[4749]: I0320 07:29:05.579773 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31055e7f-09ed-4429-a033-18a15794dd9c-catalog-content\") pod \"31055e7f-09ed-4429-a033-18a15794dd9c\" (UID: \"31055e7f-09ed-4429-a033-18a15794dd9c\") " Mar 20 07:29:05 crc kubenswrapper[4749]: I0320 07:29:05.580021 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zk5nx\" (UniqueName: \"kubernetes.io/projected/31055e7f-09ed-4429-a033-18a15794dd9c-kube-api-access-zk5nx\") pod \"31055e7f-09ed-4429-a033-18a15794dd9c\" (UID: \"31055e7f-09ed-4429-a033-18a15794dd9c\") " Mar 20 07:29:05 crc kubenswrapper[4749]: I0320 07:29:05.580067 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31055e7f-09ed-4429-a033-18a15794dd9c-utilities\") pod \"31055e7f-09ed-4429-a033-18a15794dd9c\" (UID: \"31055e7f-09ed-4429-a033-18a15794dd9c\") " Mar 20 07:29:05 crc kubenswrapper[4749]: I0320 
07:29:05.580887 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31055e7f-09ed-4429-a033-18a15794dd9c-utilities" (OuterVolumeSpecName: "utilities") pod "31055e7f-09ed-4429-a033-18a15794dd9c" (UID: "31055e7f-09ed-4429-a033-18a15794dd9c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:29:05 crc kubenswrapper[4749]: I0320 07:29:05.594465 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31055e7f-09ed-4429-a033-18a15794dd9c-kube-api-access-zk5nx" (OuterVolumeSpecName: "kube-api-access-zk5nx") pod "31055e7f-09ed-4429-a033-18a15794dd9c" (UID: "31055e7f-09ed-4429-a033-18a15794dd9c"). InnerVolumeSpecName "kube-api-access-zk5nx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:29:05 crc kubenswrapper[4749]: I0320 07:29:05.630103 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qcw9v" Mar 20 07:29:05 crc kubenswrapper[4749]: I0320 07:29:05.651661 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31055e7f-09ed-4429-a033-18a15794dd9c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "31055e7f-09ed-4429-a033-18a15794dd9c" (UID: "31055e7f-09ed-4429-a033-18a15794dd9c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:29:05 crc kubenswrapper[4749]: I0320 07:29:05.680896 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31055e7f-09ed-4429-a033-18a15794dd9c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 07:29:05 crc kubenswrapper[4749]: I0320 07:29:05.680934 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zk5nx\" (UniqueName: \"kubernetes.io/projected/31055e7f-09ed-4429-a033-18a15794dd9c-kube-api-access-zk5nx\") on node \"crc\" DevicePath \"\"" Mar 20 07:29:05 crc kubenswrapper[4749]: I0320 07:29:05.680945 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31055e7f-09ed-4429-a033-18a15794dd9c-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 07:29:05 crc kubenswrapper[4749]: I0320 07:29:05.869380 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qcw9v"] Mar 20 07:29:05 crc kubenswrapper[4749]: W0320 07:29:05.873894 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2031d0a6_251a_4d48_b42a_25d55600ddb8.slice/crio-0dfbee909c28036beb859144714528aa6fbbdbaf427cdb8d789f9ef371e4ba6f WatchSource:0}: Error finding container 0dfbee909c28036beb859144714528aa6fbbdbaf427cdb8d789f9ef371e4ba6f: Status 404 returned error can't find the container with id 0dfbee909c28036beb859144714528aa6fbbdbaf427cdb8d789f9ef371e4ba6f Mar 20 07:29:05 crc kubenswrapper[4749]: I0320 07:29:05.964888 4749 generic.go:334] "Generic (PLEG): container finished" podID="31055e7f-09ed-4429-a033-18a15794dd9c" containerID="b92ba90002bb93618b20b5ceb94f5d2c1115f3fc31db6e3c1dfdd2842233b86e" exitCode=0 Mar 20 07:29:05 crc kubenswrapper[4749]: I0320 07:29:05.965161 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lhvxm" Mar 20 07:29:05 crc kubenswrapper[4749]: I0320 07:29:05.965072 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lhvxm" event={"ID":"31055e7f-09ed-4429-a033-18a15794dd9c","Type":"ContainerDied","Data":"b92ba90002bb93618b20b5ceb94f5d2c1115f3fc31db6e3c1dfdd2842233b86e"} Mar 20 07:29:05 crc kubenswrapper[4749]: I0320 07:29:05.965214 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lhvxm" event={"ID":"31055e7f-09ed-4429-a033-18a15794dd9c","Type":"ContainerDied","Data":"8538c10f521b643543ed0738fa6b08cd922966afbb8acad2f6bc3a0a3f610aea"} Mar 20 07:29:05 crc kubenswrapper[4749]: I0320 07:29:05.965237 4749 scope.go:117] "RemoveContainer" containerID="b92ba90002bb93618b20b5ceb94f5d2c1115f3fc31db6e3c1dfdd2842233b86e" Mar 20 07:29:05 crc kubenswrapper[4749]: I0320 07:29:05.967469 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qcw9v" event={"ID":"2031d0a6-251a-4d48-b42a-25d55600ddb8","Type":"ContainerStarted","Data":"0dfbee909c28036beb859144714528aa6fbbdbaf427cdb8d789f9ef371e4ba6f"} Mar 20 07:29:05 crc kubenswrapper[4749]: I0320 07:29:05.980600 4749 scope.go:117] "RemoveContainer" containerID="5b40e532afaf6c99444f304bb237ce1ff7465ed9e4612217a14795f23118703d" Mar 20 07:29:06 crc kubenswrapper[4749]: I0320 07:29:06.003079 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lhvxm"] Mar 20 07:29:06 crc kubenswrapper[4749]: I0320 07:29:06.005015 4749 scope.go:117] "RemoveContainer" containerID="fd216e548d7138ef3e9e9ecd959b7688e7950c52f2007e4dae0f73f123ad684e" Mar 20 07:29:06 crc kubenswrapper[4749]: I0320 07:29:06.011000 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lhvxm"] Mar 20 07:29:06 crc kubenswrapper[4749]: I0320 07:29:06.019188 4749 scope.go:117] "RemoveContainer" containerID="b92ba90002bb93618b20b5ceb94f5d2c1115f3fc31db6e3c1dfdd2842233b86e" Mar 20 07:29:06 crc kubenswrapper[4749]: E0320 07:29:06.019594 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b92ba90002bb93618b20b5ceb94f5d2c1115f3fc31db6e3c1dfdd2842233b86e\": container with ID starting with b92ba90002bb93618b20b5ceb94f5d2c1115f3fc31db6e3c1dfdd2842233b86e not found: ID does not exist" containerID="b92ba90002bb93618b20b5ceb94f5d2c1115f3fc31db6e3c1dfdd2842233b86e" Mar 20 07:29:06 crc kubenswrapper[4749]: I0320 07:29:06.019629 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b92ba90002bb93618b20b5ceb94f5d2c1115f3fc31db6e3c1dfdd2842233b86e"} err="failed to get container status \"b92ba90002bb93618b20b5ceb94f5d2c1115f3fc31db6e3c1dfdd2842233b86e\": rpc error: code = NotFound desc = could not find container \"b92ba90002bb93618b20b5ceb94f5d2c1115f3fc31db6e3c1dfdd2842233b86e\": container with ID starting with b92ba90002bb93618b20b5ceb94f5d2c1115f3fc31db6e3c1dfdd2842233b86e not found: ID does not exist" Mar 20 07:29:06 crc kubenswrapper[4749]: I0320 07:29:06.019652 4749 scope.go:117] "RemoveContainer" containerID="5b40e532afaf6c99444f304bb237ce1ff7465ed9e4612217a14795f23118703d" Mar 20 07:29:06 crc kubenswrapper[4749]: E0320 07:29:06.019955 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5b40e532afaf6c99444f304bb237ce1ff7465ed9e4612217a14795f23118703d\": container with ID starting with 5b40e532afaf6c99444f304bb237ce1ff7465ed9e4612217a14795f23118703d not found: ID does not exist" containerID="5b40e532afaf6c99444f304bb237ce1ff7465ed9e4612217a14795f23118703d" Mar 20 07:29:06 crc kubenswrapper[4749]: I0320 07:29:06.019983 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b40e532afaf6c99444f304bb237ce1ff7465ed9e4612217a14795f23118703d"} err="failed to get container status \"5b40e532afaf6c99444f304bb237ce1ff7465ed9e4612217a14795f23118703d\": rpc error: code = NotFound desc = could not find container \"5b40e532afaf6c99444f304bb237ce1ff7465ed9e4612217a14795f23118703d\": container with ID starting with 5b40e532afaf6c99444f304bb237ce1ff7465ed9e4612217a14795f23118703d not found: ID does not exist" Mar 20 07:29:06 crc kubenswrapper[4749]: I0320 07:29:06.020000 4749 scope.go:117] "RemoveContainer" containerID="fd216e548d7138ef3e9e9ecd959b7688e7950c52f2007e4dae0f73f123ad684e" Mar 20 07:29:06 crc kubenswrapper[4749]: E0320 07:29:06.020213 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd216e548d7138ef3e9e9ecd959b7688e7950c52f2007e4dae0f73f123ad684e\": container with ID starting with fd216e548d7138ef3e9e9ecd959b7688e7950c52f2007e4dae0f73f123ad684e not found: ID does not exist" containerID="fd216e548d7138ef3e9e9ecd959b7688e7950c52f2007e4dae0f73f123ad684e" Mar 20 07:29:06 crc kubenswrapper[4749]: I0320 07:29:06.020248 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd216e548d7138ef3e9e9ecd959b7688e7950c52f2007e4dae0f73f123ad684e"} err="failed to get container status \"fd216e548d7138ef3e9e9ecd959b7688e7950c52f2007e4dae0f73f123ad684e\": rpc error: code = NotFound desc = could not find container \"fd216e548d7138ef3e9e9ecd959b7688e7950c52f2007e4dae0f73f123ad684e\": container with ID starting with fd216e548d7138ef3e9e9ecd959b7688e7950c52f2007e4dae0f73f123ad684e not found: ID does not exist" Mar 20 07:29:06 crc kubenswrapper[4749]: I0320 07:29:06.191790 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31055e7f-09ed-4429-a033-18a15794dd9c" path="/var/lib/kubelet/pods/31055e7f-09ed-4429-a033-18a15794dd9c/volumes" Mar 20 07:29:06 crc kubenswrapper[4749]: I0320 07:29:06.294072 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p9db" Mar 20 07:29:06 crc kubenswrapper[4749]: I0320 07:29:06.961476 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-d7bq6" Mar 20 07:29:06 crc kubenswrapper[4749]: I0320 07:29:06.979581 4749 generic.go:334] "Generic (PLEG): container finished" podID="2031d0a6-251a-4d48-b42a-25d55600ddb8" containerID="f39923698ca770dadaef7ab8ae5e9001ddba5f6720b8ed6257641275efaf54bb" exitCode=0 Mar 20 07:29:06 crc kubenswrapper[4749]: I0320 07:29:06.979619 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qcw9v" event={"ID":"2031d0a6-251a-4d48-b42a-25d55600ddb8","Type":"ContainerDied","Data":"f39923698ca770dadaef7ab8ae5e9001ddba5f6720b8ed6257641275efaf54bb"} Mar 20 07:29:07 crc kubenswrapper[4749]: I0320 07:29:07.842435 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-s5f42" Mar 20 07:29:09 crc kubenswrapper[4749]: I0320 07:29:09.003534 
4749 generic.go:334] "Generic (PLEG): container finished" podID="2031d0a6-251a-4d48-b42a-25d55600ddb8" containerID="baa55d26664e90a9937436e405ae78410b11df5c4281460d963317882fdcaff2" exitCode=0 Mar 20 07:29:09 crc kubenswrapper[4749]: I0320 07:29:09.003614 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qcw9v" event={"ID":"2031d0a6-251a-4d48-b42a-25d55600ddb8","Type":"ContainerDied","Data":"baa55d26664e90a9937436e405ae78410b11df5c4281460d963317882fdcaff2"} Mar 20 07:29:10 crc kubenswrapper[4749]: I0320 07:29:10.014007 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qcw9v" event={"ID":"2031d0a6-251a-4d48-b42a-25d55600ddb8","Type":"ContainerStarted","Data":"4f85c24eb6e1da2124e7415598871873481c1619828a0358f4b0ed9d872f6802"} Mar 20 07:29:10 crc kubenswrapper[4749]: I0320 07:29:10.042386 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qcw9v" podStartSLOduration=2.533227608 podStartE2EDuration="5.042369909s" podCreationTimestamp="2026-03-20 07:29:05 +0000 UTC" firstStartedPulling="2026-03-20 07:29:06.981329156 +0000 UTC m=+983.530986813" lastFinishedPulling="2026-03-20 07:29:09.490471467 +0000 UTC m=+986.040129114" observedRunningTime="2026-03-20 07:29:10.038158977 +0000 UTC m=+986.587816634" watchObservedRunningTime="2026-03-20 07:29:10.042369909 +0000 UTC m=+986.592027556" Mar 20 07:29:13 crc kubenswrapper[4749]: I0320 07:29:13.889263 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-xr8zb"] Mar 20 07:29:13 crc kubenswrapper[4749]: E0320 07:29:13.890100 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31055e7f-09ed-4429-a033-18a15794dd9c" containerName="registry-server" Mar 20 07:29:13 crc kubenswrapper[4749]: I0320 07:29:13.890122 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="31055e7f-09ed-4429-a033-18a15794dd9c" containerName="registry-server" Mar 20 07:29:13 crc kubenswrapper[4749]: E0320 07:29:13.890155 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31055e7f-09ed-4429-a033-18a15794dd9c" containerName="extract-utilities" Mar 20 07:29:13 crc kubenswrapper[4749]: I0320 07:29:13.890174 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="31055e7f-09ed-4429-a033-18a15794dd9c" containerName="extract-utilities" Mar 20 07:29:13 crc kubenswrapper[4749]: E0320 07:29:13.890212 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31055e7f-09ed-4429-a033-18a15794dd9c" containerName="extract-content" Mar 20 07:29:13 crc kubenswrapper[4749]: I0320 07:29:13.890230 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="31055e7f-09ed-4429-a033-18a15794dd9c" containerName="extract-content" Mar 20 07:29:13 crc kubenswrapper[4749]: I0320 07:29:13.890504 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="31055e7f-09ed-4429-a033-18a15794dd9c" containerName="registry-server" Mar 20 07:29:13 crc kubenswrapper[4749]: I0320 07:29:13.891174 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-xr8zb" Mar 20 07:29:13 crc kubenswrapper[4749]: I0320 07:29:13.894912 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-s4428" Mar 20 07:29:13 crc kubenswrapper[4749]: I0320 07:29:13.898171 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 20 07:29:13 crc kubenswrapper[4749]: I0320 07:29:13.898591 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 20 07:29:13 crc kubenswrapper[4749]: I0320 07:29:13.904898 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-xr8zb"] Mar 20 07:29:13 crc kubenswrapper[4749]: I0320 07:29:13.997096 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6djj\" (UniqueName: \"kubernetes.io/projected/1d4662da-8993-4e3e-a71f-482be0e247a1-kube-api-access-j6djj\") pod \"openstack-operator-index-xr8zb\" (UID: \"1d4662da-8993-4e3e-a71f-482be0e247a1\") " pod="openstack-operators/openstack-operator-index-xr8zb" Mar 20 07:29:14 crc kubenswrapper[4749]: I0320 07:29:14.099368 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6djj\" (UniqueName: \"kubernetes.io/projected/1d4662da-8993-4e3e-a71f-482be0e247a1-kube-api-access-j6djj\") pod \"openstack-operator-index-xr8zb\" (UID: \"1d4662da-8993-4e3e-a71f-482be0e247a1\") " pod="openstack-operators/openstack-operator-index-xr8zb" Mar 20 07:29:14 crc kubenswrapper[4749]: I0320 07:29:14.135789 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6djj\" (UniqueName: \"kubernetes.io/projected/1d4662da-8993-4e3e-a71f-482be0e247a1-kube-api-access-j6djj\") pod \"openstack-operator-index-xr8zb\" (UID: \"1d4662da-8993-4e3e-a71f-482be0e247a1\") " pod="openstack-operators/openstack-operator-index-xr8zb" Mar 20 07:29:14 crc kubenswrapper[4749]: I0320 07:29:14.226928 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-xr8zb" Mar 20 07:29:14 crc kubenswrapper[4749]: I0320 07:29:14.539945 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-xr8zb"] Mar 20 07:29:14 crc kubenswrapper[4749]: W0320 07:29:14.541249 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d4662da_8993_4e3e_a71f_482be0e247a1.slice/crio-4ee0bee5946c9421f9ddcf972702095e0d9610fbca22b5039ce291400bdc741f WatchSource:0}: Error finding container 4ee0bee5946c9421f9ddcf972702095e0d9610fbca22b5039ce291400bdc741f: Status 404 returned error can't find the container with id 4ee0bee5946c9421f9ddcf972702095e0d9610fbca22b5039ce291400bdc741f Mar 20 07:29:15 crc kubenswrapper[4749]: I0320 07:29:15.050016 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-xr8zb" event={"ID":"1d4662da-8993-4e3e-a71f-482be0e247a1","Type":"ContainerStarted","Data":"4ee0bee5946c9421f9ddcf972702095e0d9610fbca22b5039ce291400bdc741f"} Mar 20 07:29:15 crc kubenswrapper[4749]: I0320 07:29:15.630365 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qcw9v" Mar 20 07:29:15 crc kubenswrapper[4749]: I0320 07:29:15.631660 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qcw9v" Mar 20 07:29:15 crc kubenswrapper[4749]: I0320 07:29:15.674140 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qcw9v" Mar 20 07:29:16 crc kubenswrapper[4749]: I0320 07:29:16.135377 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qcw9v" Mar 20 07:29:16 crc kubenswrapper[4749]: I0320 07:29:16.248153 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-5ww9w" Mar 20 07:29:17 crc kubenswrapper[4749]: I0320 07:29:17.068571 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-xr8zb" event={"ID":"1d4662da-8993-4e3e-a71f-482be0e247a1","Type":"ContainerStarted","Data":"cd80b4ebecc791d7377650e18720c39515df5ab44e91a0a12f69ca54f0888821"} Mar 20 07:29:17 crc kubenswrapper[4749]: I0320 07:29:17.087356 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-xr8zb" podStartSLOduration=1.802846924 podStartE2EDuration="4.087333172s" podCreationTimestamp="2026-03-20 07:29:13 +0000 UTC" firstStartedPulling="2026-03-20 07:29:14.547002961 +0000 UTC m=+991.096660618" lastFinishedPulling="2026-03-20 07:29:16.831489179 +0000 UTC m=+993.381146866" observedRunningTime="2026-03-20 07:29:17.081765086 +0000 UTC m=+993.631422773" watchObservedRunningTime="2026-03-20 07:29:17.087333172 +0000 UTC m=+993.636990859" Mar 20 07:29:18 crc kubenswrapper[4749]: I0320 07:29:18.266860 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-xr8zb"] Mar 20 07:29:18 crc kubenswrapper[4749]: I0320 07:29:18.879600 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-zm7gn"] Mar 20 07:29:18 crc kubenswrapper[4749]: I0320 07:29:18.880704 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-zm7gn" Mar 20 07:29:18 crc kubenswrapper[4749]: I0320 07:29:18.890084 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-zm7gn"] Mar 20 07:29:18 crc kubenswrapper[4749]: I0320 07:29:18.961324 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg9wx\" (UniqueName: \"kubernetes.io/projected/14b1dfbd-5576-43ed-b482-da48c031840a-kube-api-access-dg9wx\") pod \"openstack-operator-index-zm7gn\" (UID: \"14b1dfbd-5576-43ed-b482-da48c031840a\") " pod="openstack-operators/openstack-operator-index-zm7gn" Mar 20 07:29:19 crc kubenswrapper[4749]: I0320 07:29:19.063397 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dg9wx\" (UniqueName: \"kubernetes.io/projected/14b1dfbd-5576-43ed-b482-da48c031840a-kube-api-access-dg9wx\") pod \"openstack-operator-index-zm7gn\" (UID: \"14b1dfbd-5576-43ed-b482-da48c031840a\") " pod="openstack-operators/openstack-operator-index-zm7gn" Mar 20 07:29:19 crc kubenswrapper[4749]: I0320 07:29:19.082139 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-xr8zb" podUID="1d4662da-8993-4e3e-a71f-482be0e247a1" containerName="registry-server" containerID="cri-o://cd80b4ebecc791d7377650e18720c39515df5ab44e91a0a12f69ca54f0888821" gracePeriod=2 Mar 20 07:29:19 crc kubenswrapper[4749]: I0320 07:29:19.088073 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg9wx\" (UniqueName: \"kubernetes.io/projected/14b1dfbd-5576-43ed-b482-da48c031840a-kube-api-access-dg9wx\") pod \"openstack-operator-index-zm7gn\" (UID: \"14b1dfbd-5576-43ed-b482-da48c031840a\") " pod="openstack-operators/openstack-operator-index-zm7gn" Mar 20 07:29:19 crc kubenswrapper[4749]: I0320 07:29:19.199474 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-zm7gn" Mar 20 07:29:19 crc kubenswrapper[4749]: I0320 07:29:19.556254 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-xr8zb" Mar 20 07:29:19 crc kubenswrapper[4749]: I0320 07:29:19.655608 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-zm7gn"] Mar 20 07:29:19 crc kubenswrapper[4749]: W0320 07:29:19.656892 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14b1dfbd_5576_43ed_b482_da48c031840a.slice/crio-362faeef45e3fa25c7e601d5b1edbb132dad1f12695cc03f4439aa5c6faf8c79 WatchSource:0}: Error finding container 362faeef45e3fa25c7e601d5b1edbb132dad1f12695cc03f4439aa5c6faf8c79: Status 404 returned error can't find the container with id 362faeef45e3fa25c7e601d5b1edbb132dad1f12695cc03f4439aa5c6faf8c79 Mar 20 07:29:19 crc kubenswrapper[4749]: I0320 07:29:19.666405 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qcw9v"] Mar 20 07:29:19 crc kubenswrapper[4749]: I0320 07:29:19.666635 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qcw9v" podUID="2031d0a6-251a-4d48-b42a-25d55600ddb8" containerName="registry-server" containerID="cri-o://4f85c24eb6e1da2124e7415598871873481c1619828a0358f4b0ed9d872f6802" gracePeriod=2 Mar 20 07:29:19 crc kubenswrapper[4749]: I0320 07:29:19.672149 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6djj\" (UniqueName: \"kubernetes.io/projected/1d4662da-8993-4e3e-a71f-482be0e247a1-kube-api-access-j6djj\") pod \"1d4662da-8993-4e3e-a71f-482be0e247a1\" (UID: \"1d4662da-8993-4e3e-a71f-482be0e247a1\") " Mar 20 07:29:19 crc kubenswrapper[4749]: I0320 07:29:19.676751 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d4662da-8993-4e3e-a71f-482be0e247a1-kube-api-access-j6djj" (OuterVolumeSpecName: "kube-api-access-j6djj") pod "1d4662da-8993-4e3e-a71f-482be0e247a1" (UID: "1d4662da-8993-4e3e-a71f-482be0e247a1"). InnerVolumeSpecName "kube-api-access-j6djj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:29:19 crc kubenswrapper[4749]: I0320 07:29:19.774082 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6djj\" (UniqueName: \"kubernetes.io/projected/1d4662da-8993-4e3e-a71f-482be0e247a1-kube-api-access-j6djj\") on node \"crc\" DevicePath \"\"" Mar 20 07:29:19 crc kubenswrapper[4749]: I0320 07:29:19.991914 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qcw9v" Mar 20 07:29:20 crc kubenswrapper[4749]: I0320 07:29:20.076903 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2031d0a6-251a-4d48-b42a-25d55600ddb8-catalog-content\") pod \"2031d0a6-251a-4d48-b42a-25d55600ddb8\" (UID: \"2031d0a6-251a-4d48-b42a-25d55600ddb8\") " Mar 20 07:29:20 crc kubenswrapper[4749]: I0320 07:29:20.077011 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2031d0a6-251a-4d48-b42a-25d55600ddb8-utilities\") pod \"2031d0a6-251a-4d48-b42a-25d55600ddb8\" (UID: \"2031d0a6-251a-4d48-b42a-25d55600ddb8\") " Mar 20 07:29:20 crc kubenswrapper[4749]: I0320 07:29:20.077044 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bktmb\" (UniqueName: \"kubernetes.io/projected/2031d0a6-251a-4d48-b42a-25d55600ddb8-kube-api-access-bktmb\") pod \"2031d0a6-251a-4d48-b42a-25d55600ddb8\" (UID: \"2031d0a6-251a-4d48-b42a-25d55600ddb8\") " Mar 20 07:29:20 crc kubenswrapper[4749]: I0320 07:29:20.078200 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2031d0a6-251a-4d48-b42a-25d55600ddb8-utilities" (OuterVolumeSpecName: "utilities") pod "2031d0a6-251a-4d48-b42a-25d55600ddb8" (UID: "2031d0a6-251a-4d48-b42a-25d55600ddb8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:29:20 crc kubenswrapper[4749]: I0320 07:29:20.081858 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2031d0a6-251a-4d48-b42a-25d55600ddb8-kube-api-access-bktmb" (OuterVolumeSpecName: "kube-api-access-bktmb") pod "2031d0a6-251a-4d48-b42a-25d55600ddb8" (UID: "2031d0a6-251a-4d48-b42a-25d55600ddb8"). InnerVolumeSpecName "kube-api-access-bktmb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:29:20 crc kubenswrapper[4749]: I0320 07:29:20.095876 4749 generic.go:334] "Generic (PLEG): container finished" podID="2031d0a6-251a-4d48-b42a-25d55600ddb8" containerID="4f85c24eb6e1da2124e7415598871873481c1619828a0358f4b0ed9d872f6802" exitCode=0 Mar 20 07:29:20 crc kubenswrapper[4749]: I0320 07:29:20.096055 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qcw9v" Mar 20 07:29:20 crc kubenswrapper[4749]: I0320 07:29:20.096533 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qcw9v" event={"ID":"2031d0a6-251a-4d48-b42a-25d55600ddb8","Type":"ContainerDied","Data":"4f85c24eb6e1da2124e7415598871873481c1619828a0358f4b0ed9d872f6802"} Mar 20 07:29:20 crc kubenswrapper[4749]: I0320 07:29:20.096571 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qcw9v" event={"ID":"2031d0a6-251a-4d48-b42a-25d55600ddb8","Type":"ContainerDied","Data":"0dfbee909c28036beb859144714528aa6fbbdbaf427cdb8d789f9ef371e4ba6f"} Mar 20 07:29:20 crc kubenswrapper[4749]: I0320 07:29:20.096591 4749 scope.go:117] "RemoveContainer" containerID="4f85c24eb6e1da2124e7415598871873481c1619828a0358f4b0ed9d872f6802" Mar 20 07:29:20 crc kubenswrapper[4749]: I0320 07:29:20.100351 4749 generic.go:334] "Generic (PLEG): container finished" podID="1d4662da-8993-4e3e-a71f-482be0e247a1" containerID="cd80b4ebecc791d7377650e18720c39515df5ab44e91a0a12f69ca54f0888821" exitCode=0 Mar 20 07:29:20 crc kubenswrapper[4749]: I0320 07:29:20.100449 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-xr8zb" event={"ID":"1d4662da-8993-4e3e-a71f-482be0e247a1","Type":"ContainerDied","Data":"cd80b4ebecc791d7377650e18720c39515df5ab44e91a0a12f69ca54f0888821"} Mar 20 07:29:20 crc kubenswrapper[4749]: I0320 07:29:20.100506 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-xr8zb" event={"ID":"1d4662da-8993-4e3e-a71f-482be0e247a1","Type":"ContainerDied","Data":"4ee0bee5946c9421f9ddcf972702095e0d9610fbca22b5039ce291400bdc741f"} Mar 20 07:29:20 crc kubenswrapper[4749]: I0320 07:29:20.100594 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-xr8zb" Mar 20 07:29:20 crc kubenswrapper[4749]: I0320 07:29:20.110600 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-zm7gn" event={"ID":"14b1dfbd-5576-43ed-b482-da48c031840a","Type":"ContainerStarted","Data":"d3635055983b20031390feaca87535029fc7965018554f2bd61cd1f7b540ce04"} Mar 20 07:29:20 crc kubenswrapper[4749]: I0320 07:29:20.110645 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-zm7gn" event={"ID":"14b1dfbd-5576-43ed-b482-da48c031840a","Type":"ContainerStarted","Data":"362faeef45e3fa25c7e601d5b1edbb132dad1f12695cc03f4439aa5c6faf8c79"} Mar 20 07:29:20 crc kubenswrapper[4749]: I0320 07:29:20.125185 4749 scope.go:117] "RemoveContainer" containerID="baa55d26664e90a9937436e405ae78410b11df5c4281460d963317882fdcaff2" Mar 20 07:29:20 crc kubenswrapper[4749]: I0320 07:29:20.135360 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-zm7gn" podStartSLOduration=2.09348829 podStartE2EDuration="2.135340178s" podCreationTimestamp="2026-03-20 07:29:18 +0000 UTC" firstStartedPulling="2026-03-20 07:29:19.660595663 +0000 UTC m=+996.210253320" lastFinishedPulling="2026-03-20 07:29:19.702447561 +0000 UTC m=+996.252105208" observedRunningTime="2026-03-20 07:29:20.132636253 +0000 UTC m=+996.682293910" watchObservedRunningTime="2026-03-20 07:29:20.135340178 +0000 UTC m=+996.684997835" Mar 20 07:29:20 crc kubenswrapper[4749]: I0320 07:29:20.139649 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2031d0a6-251a-4d48-b42a-25d55600ddb8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2031d0a6-251a-4d48-b42a-25d55600ddb8" (UID: "2031d0a6-251a-4d48-b42a-25d55600ddb8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:29:20 crc kubenswrapper[4749]: I0320 07:29:20.142215 4749 scope.go:117] "RemoveContainer" containerID="f39923698ca770dadaef7ab8ae5e9001ddba5f6720b8ed6257641275efaf54bb" Mar 20 07:29:20 crc kubenswrapper[4749]: I0320 07:29:20.148548 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-xr8zb"] Mar 20 07:29:20 crc kubenswrapper[4749]: I0320 07:29:20.153503 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-xr8zb"] Mar 20 07:29:20 crc kubenswrapper[4749]: I0320 07:29:20.165778 4749 scope.go:117] "RemoveContainer" containerID="4f85c24eb6e1da2124e7415598871873481c1619828a0358f4b0ed9d872f6802" Mar 20 07:29:20 crc kubenswrapper[4749]: E0320 07:29:20.166240 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f85c24eb6e1da2124e7415598871873481c1619828a0358f4b0ed9d872f6802\": container with ID starting with 4f85c24eb6e1da2124e7415598871873481c1619828a0358f4b0ed9d872f6802 not found: ID does not exist" containerID="4f85c24eb6e1da2124e7415598871873481c1619828a0358f4b0ed9d872f6802" Mar 20 07:29:20 crc kubenswrapper[4749]: I0320 07:29:20.166274 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f85c24eb6e1da2124e7415598871873481c1619828a0358f4b0ed9d872f6802"} err="failed to get container status \"4f85c24eb6e1da2124e7415598871873481c1619828a0358f4b0ed9d872f6802\": rpc error: code = NotFound desc = could not find container \"4f85c24eb6e1da2124e7415598871873481c1619828a0358f4b0ed9d872f6802\": container with ID starting with 4f85c24eb6e1da2124e7415598871873481c1619828a0358f4b0ed9d872f6802 not found: ID does not exist" Mar 20 07:29:20 crc kubenswrapper[4749]: I0320 07:29:20.166326 4749 scope.go:117] "RemoveContainer" containerID="baa55d26664e90a9937436e405ae78410b11df5c4281460d963317882fdcaff2" Mar 20 07:29:20 crc kubenswrapper[4749]: E0320 07:29:20.166636 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"baa55d26664e90a9937436e405ae78410b11df5c4281460d963317882fdcaff2\": container with ID starting with baa55d26664e90a9937436e405ae78410b11df5c4281460d963317882fdcaff2 not found: ID does not exist" containerID="baa55d26664e90a9937436e405ae78410b11df5c4281460d963317882fdcaff2" Mar 20 07:29:20 crc kubenswrapper[4749]: I0320 07:29:20.166657 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"baa55d26664e90a9937436e405ae78410b11df5c4281460d963317882fdcaff2"} err="failed to get container status \"baa55d26664e90a9937436e405ae78410b11df5c4281460d963317882fdcaff2\": rpc error: code = NotFound desc = could not find container \"baa55d26664e90a9937436e405ae78410b11df5c4281460d963317882fdcaff2\": container with ID starting with baa55d26664e90a9937436e405ae78410b11df5c4281460d963317882fdcaff2 not found: ID does not exist" Mar 20 07:29:20 crc kubenswrapper[4749]: I0320 07:29:20.166675 4749 scope.go:117] "RemoveContainer" containerID="f39923698ca770dadaef7ab8ae5e9001ddba5f6720b8ed6257641275efaf54bb" Mar 20 07:29:20 crc kubenswrapper[4749]: E0320 07:29:20.166960 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f39923698ca770dadaef7ab8ae5e9001ddba5f6720b8ed6257641275efaf54bb\": container with ID starting with 
f39923698ca770dadaef7ab8ae5e9001ddba5f6720b8ed6257641275efaf54bb not found: ID does not exist" containerID="f39923698ca770dadaef7ab8ae5e9001ddba5f6720b8ed6257641275efaf54bb" Mar 20 07:29:20 crc kubenswrapper[4749]: I0320 07:29:20.166986 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f39923698ca770dadaef7ab8ae5e9001ddba5f6720b8ed6257641275efaf54bb"} err="failed to get container status \"f39923698ca770dadaef7ab8ae5e9001ddba5f6720b8ed6257641275efaf54bb\": rpc error: code = NotFound desc = could not find container \"f39923698ca770dadaef7ab8ae5e9001ddba5f6720b8ed6257641275efaf54bb\": container with ID starting with f39923698ca770dadaef7ab8ae5e9001ddba5f6720b8ed6257641275efaf54bb not found: ID does not exist" Mar 20 07:29:20 crc kubenswrapper[4749]: I0320 07:29:20.167001 4749 scope.go:117] "RemoveContainer" containerID="cd80b4ebecc791d7377650e18720c39515df5ab44e91a0a12f69ca54f0888821" Mar 20 07:29:20 crc kubenswrapper[4749]: I0320 07:29:20.178132 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2031d0a6-251a-4d48-b42a-25d55600ddb8-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 07:29:20 crc kubenswrapper[4749]: I0320 07:29:20.178162 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bktmb\" (UniqueName: \"kubernetes.io/projected/2031d0a6-251a-4d48-b42a-25d55600ddb8-kube-api-access-bktmb\") on node \"crc\" DevicePath \"\"" Mar 20 07:29:20 crc kubenswrapper[4749]: I0320 07:29:20.178176 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2031d0a6-251a-4d48-b42a-25d55600ddb8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 07:29:20 crc kubenswrapper[4749]: I0320 07:29:20.188675 4749 scope.go:117] "RemoveContainer" containerID="cd80b4ebecc791d7377650e18720c39515df5ab44e91a0a12f69ca54f0888821" Mar 20 07:29:20 crc kubenswrapper[4749]: I0320 07:29:20.190957 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d4662da-8993-4e3e-a71f-482be0e247a1" path="/var/lib/kubelet/pods/1d4662da-8993-4e3e-a71f-482be0e247a1/volumes" Mar 20 07:29:20 crc kubenswrapper[4749]: E0320 07:29:20.193693 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd80b4ebecc791d7377650e18720c39515df5ab44e91a0a12f69ca54f0888821\": container with ID starting with cd80b4ebecc791d7377650e18720c39515df5ab44e91a0a12f69ca54f0888821 not found: ID does not exist" containerID="cd80b4ebecc791d7377650e18720c39515df5ab44e91a0a12f69ca54f0888821" Mar 20 07:29:20 crc kubenswrapper[4749]: I0320 07:29:20.193736 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd80b4ebecc791d7377650e18720c39515df5ab44e91a0a12f69ca54f0888821"} err="failed to get container status \"cd80b4ebecc791d7377650e18720c39515df5ab44e91a0a12f69ca54f0888821\": rpc error: code = NotFound desc = could not find container \"cd80b4ebecc791d7377650e18720c39515df5ab44e91a0a12f69ca54f0888821\": container with ID starting with cd80b4ebecc791d7377650e18720c39515df5ab44e91a0a12f69ca54f0888821 not found: ID does not exist" Mar 20 07:29:20 crc kubenswrapper[4749]: I0320 07:29:20.423204 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qcw9v"] Mar 20 07:29:20 crc kubenswrapper[4749]: I0320 07:29:20.431601 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/community-operators-qcw9v"] Mar 20 07:29:22 crc kubenswrapper[4749]: I0320 07:29:22.189990 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2031d0a6-251a-4d48-b42a-25d55600ddb8" path="/var/lib/kubelet/pods/2031d0a6-251a-4d48-b42a-25d55600ddb8/volumes" Mar 20 07:29:29 crc kubenswrapper[4749]: I0320 07:29:29.200174 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-zm7gn" Mar 20 07:29:29 crc kubenswrapper[4749]: I0320 07:29:29.200860 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-zm7gn" Mar 20 07:29:29 crc kubenswrapper[4749]: I0320 07:29:29.242928 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-zm7gn" Mar 20 07:29:30 crc kubenswrapper[4749]: I0320 07:29:30.229991 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-zm7gn" Mar 20 07:29:31 crc kubenswrapper[4749]: I0320 07:29:31.320333 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/be367e3118afbcc92c106aa60ad89a847599882d02c81ac0e6d7ee6cbcpg7z7"] Mar 20 07:29:31 crc kubenswrapper[4749]: E0320 07:29:31.320611 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2031d0a6-251a-4d48-b42a-25d55600ddb8" containerName="registry-server" Mar 20 07:29:31 crc kubenswrapper[4749]: I0320 07:29:31.320627 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2031d0a6-251a-4d48-b42a-25d55600ddb8" containerName="registry-server" Mar 20 07:29:31 crc kubenswrapper[4749]: E0320 07:29:31.320643 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2031d0a6-251a-4d48-b42a-25d55600ddb8" containerName="extract-utilities" Mar 20 07:29:31 crc kubenswrapper[4749]: I0320 07:29:31.320653 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2031d0a6-251a-4d48-b42a-25d55600ddb8" containerName="extract-utilities" Mar 20 07:29:31 crc kubenswrapper[4749]: E0320 07:29:31.320663 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2031d0a6-251a-4d48-b42a-25d55600ddb8" containerName="extract-content" Mar 20 07:29:31 crc kubenswrapper[4749]: I0320 07:29:31.320671 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2031d0a6-251a-4d48-b42a-25d55600ddb8" containerName="extract-content" Mar 20 07:29:31 crc kubenswrapper[4749]: E0320 07:29:31.320681 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d4662da-8993-4e3e-a71f-482be0e247a1" containerName="registry-server" Mar 20 07:29:31 crc kubenswrapper[4749]: I0320 07:29:31.320689 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d4662da-8993-4e3e-a71f-482be0e247a1" containerName="registry-server" Mar 20 07:29:31 crc kubenswrapper[4749]: I0320 07:29:31.320819 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2031d0a6-251a-4d48-b42a-25d55600ddb8" containerName="registry-server" Mar 20 07:29:31 crc kubenswrapper[4749]: I0320 07:29:31.320841 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d4662da-8993-4e3e-a71f-482be0e247a1" containerName="registry-server" Mar 20 07:29:31 crc kubenswrapper[4749]: I0320 07:29:31.321813 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/be367e3118afbcc92c106aa60ad89a847599882d02c81ac0e6d7ee6cbcpg7z7" Mar 20 07:29:31 crc kubenswrapper[4749]: I0320 07:29:31.326403 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-nbgpt" Mar 20 07:29:31 crc kubenswrapper[4749]: I0320 07:29:31.367979 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/be367e3118afbcc92c106aa60ad89a847599882d02c81ac0e6d7ee6cbcpg7z7"] Mar 20 07:29:31 crc kubenswrapper[4749]: I0320 07:29:31.447150 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a1525cb6-0a04-499c-8737-81f1981815da-bundle\") pod \"be367e3118afbcc92c106aa60ad89a847599882d02c81ac0e6d7ee6cbcpg7z7\" (UID: \"a1525cb6-0a04-499c-8737-81f1981815da\") " pod="openstack-operators/be367e3118afbcc92c106aa60ad89a847599882d02c81ac0e6d7ee6cbcpg7z7" Mar 20 07:29:31 crc kubenswrapper[4749]: I0320 07:29:31.447214 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rn9x\" (UniqueName: \"kubernetes.io/projected/a1525cb6-0a04-499c-8737-81f1981815da-kube-api-access-8rn9x\") pod \"be367e3118afbcc92c106aa60ad89a847599882d02c81ac0e6d7ee6cbcpg7z7\" (UID: \"a1525cb6-0a04-499c-8737-81f1981815da\") " pod="openstack-operators/be367e3118afbcc92c106aa60ad89a847599882d02c81ac0e6d7ee6cbcpg7z7" Mar 20 07:29:31 crc kubenswrapper[4749]: I0320 07:29:31.447266 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a1525cb6-0a04-499c-8737-81f1981815da-util\") pod \"be367e3118afbcc92c106aa60ad89a847599882d02c81ac0e6d7ee6cbcpg7z7\" (UID: \"a1525cb6-0a04-499c-8737-81f1981815da\") " pod="openstack-operators/be367e3118afbcc92c106aa60ad89a847599882d02c81ac0e6d7ee6cbcpg7z7" Mar 20 07:29:31 crc kubenswrapper[4749]: I0320 07:29:31.549017 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a1525cb6-0a04-499c-8737-81f1981815da-bundle\") pod \"be367e3118afbcc92c106aa60ad89a847599882d02c81ac0e6d7ee6cbcpg7z7\" (UID: \"a1525cb6-0a04-499c-8737-81f1981815da\") " pod="openstack-operators/be367e3118afbcc92c106aa60ad89a847599882d02c81ac0e6d7ee6cbcpg7z7" Mar 20 07:29:31 crc kubenswrapper[4749]: I0320 07:29:31.549473 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rn9x\" (UniqueName: \"kubernetes.io/projected/a1525cb6-0a04-499c-8737-81f1981815da-kube-api-access-8rn9x\") pod \"be367e3118afbcc92c106aa60ad89a847599882d02c81ac0e6d7ee6cbcpg7z7\" (UID: \"a1525cb6-0a04-499c-8737-81f1981815da\") " pod="openstack-operators/be367e3118afbcc92c106aa60ad89a847599882d02c81ac0e6d7ee6cbcpg7z7" Mar 20 07:29:31 crc kubenswrapper[4749]: I0320 07:29:31.549702 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a1525cb6-0a04-499c-8737-81f1981815da-util\") pod \"be367e3118afbcc92c106aa60ad89a847599882d02c81ac0e6d7ee6cbcpg7z7\" (UID: \"a1525cb6-0a04-499c-8737-81f1981815da\") " pod="openstack-operators/be367e3118afbcc92c106aa60ad89a847599882d02c81ac0e6d7ee6cbcpg7z7" Mar 20 07:29:31 crc kubenswrapper[4749]: I0320 07:29:31.550202 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/a1525cb6-0a04-499c-8737-81f1981815da-bundle\") pod \"be367e3118afbcc92c106aa60ad89a847599882d02c81ac0e6d7ee6cbcpg7z7\" (UID: \"a1525cb6-0a04-499c-8737-81f1981815da\") " pod="openstack-operators/be367e3118afbcc92c106aa60ad89a847599882d02c81ac0e6d7ee6cbcpg7z7" Mar 20 07:29:31 crc kubenswrapper[4749]: I0320 07:29:31.550922 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a1525cb6-0a04-499c-8737-81f1981815da-util\") pod \"be367e3118afbcc92c106aa60ad89a847599882d02c81ac0e6d7ee6cbcpg7z7\" (UID: \"a1525cb6-0a04-499c-8737-81f1981815da\") " pod="openstack-operators/be367e3118afbcc92c106aa60ad89a847599882d02c81ac0e6d7ee6cbcpg7z7" Mar 20 07:29:31 crc kubenswrapper[4749]: I0320 07:29:31.585749 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rn9x\" (UniqueName: \"kubernetes.io/projected/a1525cb6-0a04-499c-8737-81f1981815da-kube-api-access-8rn9x\") pod \"be367e3118afbcc92c106aa60ad89a847599882d02c81ac0e6d7ee6cbcpg7z7\" (UID: \"a1525cb6-0a04-499c-8737-81f1981815da\") " pod="openstack-operators/be367e3118afbcc92c106aa60ad89a847599882d02c81ac0e6d7ee6cbcpg7z7" Mar 20 07:29:31 crc kubenswrapper[4749]: I0320 07:29:31.653979 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/be367e3118afbcc92c106aa60ad89a847599882d02c81ac0e6d7ee6cbcpg7z7" Mar 20 07:29:32 crc kubenswrapper[4749]: I0320 07:29:32.090343 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/be367e3118afbcc92c106aa60ad89a847599882d02c81ac0e6d7ee6cbcpg7z7"] Mar 20 07:29:32 crc kubenswrapper[4749]: I0320 07:29:32.207835 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/be367e3118afbcc92c106aa60ad89a847599882d02c81ac0e6d7ee6cbcpg7z7" event={"ID":"a1525cb6-0a04-499c-8737-81f1981815da","Type":"ContainerStarted","Data":"289bba42305ad3187de3ec3148a6f66aba1064115166955da3460ce30b7811e6"} Mar 20 07:29:33 crc kubenswrapper[4749]: I0320 07:29:33.215726 4749 generic.go:334] "Generic (PLEG): container finished" podID="a1525cb6-0a04-499c-8737-81f1981815da" containerID="0b86a0225ac3e2890b2accd1b0a13cf3c1de8a23b5a7a3ca95d3424c54705428" exitCode=0 Mar 20 07:29:33 crc kubenswrapper[4749]: I0320 07:29:33.215785 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/be367e3118afbcc92c106aa60ad89a847599882d02c81ac0e6d7ee6cbcpg7z7" event={"ID":"a1525cb6-0a04-499c-8737-81f1981815da","Type":"ContainerDied","Data":"0b86a0225ac3e2890b2accd1b0a13cf3c1de8a23b5a7a3ca95d3424c54705428"} Mar 20 07:29:34 crc kubenswrapper[4749]: I0320 07:29:34.228947 4749 generic.go:334] "Generic (PLEG): container finished" podID="a1525cb6-0a04-499c-8737-81f1981815da" containerID="98924085c263b1f03130e8a173c2fe714d8cfdb3200b752d1b89ef34ef246ac2" exitCode=0 Mar 20 07:29:34 crc kubenswrapper[4749]: I0320 07:29:34.229111 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/be367e3118afbcc92c106aa60ad89a847599882d02c81ac0e6d7ee6cbcpg7z7" event={"ID":"a1525cb6-0a04-499c-8737-81f1981815da","Type":"ContainerDied","Data":"98924085c263b1f03130e8a173c2fe714d8cfdb3200b752d1b89ef34ef246ac2"} Mar 20 07:29:34 crc kubenswrapper[4749]: I0320 07:29:34.514760 4749 patch_prober.go:28] interesting pod/machine-config-daemon-fxqfd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Mar 20 07:29:34 crc kubenswrapper[4749]: I0320 07:29:34.515075 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:29:34 crc kubenswrapper[4749]: I0320 07:29:34.515121 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" Mar 20 07:29:34 crc kubenswrapper[4749]: I0320 07:29:34.515768 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3c74897c54ef7454cef1084b8e06312bda867ecfea849b2a4ba3d53fa61618a4"} pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 07:29:34 crc kubenswrapper[4749]: I0320 07:29:34.515846 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerName="machine-config-daemon" containerID="cri-o://3c74897c54ef7454cef1084b8e06312bda867ecfea849b2a4ba3d53fa61618a4" gracePeriod=600 Mar 20 07:29:35 crc kubenswrapper[4749]: I0320 07:29:35.238731 4749 generic.go:334] "Generic (PLEG): container finished" podID="a1525cb6-0a04-499c-8737-81f1981815da" containerID="66cf578332a6ab323e9257dd4a24a4e6a973d78e82e90f9a6c15b3ee96d7370e" exitCode=0 Mar 20 07:29:35 crc kubenswrapper[4749]: I0320 07:29:35.239810 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/be367e3118afbcc92c106aa60ad89a847599882d02c81ac0e6d7ee6cbcpg7z7" event={"ID":"a1525cb6-0a04-499c-8737-81f1981815da","Type":"ContainerDied","Data":"66cf578332a6ab323e9257dd4a24a4e6a973d78e82e90f9a6c15b3ee96d7370e"} Mar 20 07:29:35 crc kubenswrapper[4749]: I0320 07:29:35.242881 4749 generic.go:334] "Generic (PLEG): container finished" podID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerID="3c74897c54ef7454cef1084b8e06312bda867ecfea849b2a4ba3d53fa61618a4" exitCode=0 Mar 20 07:29:35 crc kubenswrapper[4749]: I0320 07:29:35.242941 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" event={"ID":"12151228-1cb9-4086-9a62-f4a9583f5f69","Type":"ContainerDied","Data":"3c74897c54ef7454cef1084b8e06312bda867ecfea849b2a4ba3d53fa61618a4"} Mar 20 07:29:35 crc kubenswrapper[4749]: I0320 07:29:35.243346 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" event={"ID":"12151228-1cb9-4086-9a62-f4a9583f5f69","Type":"ContainerStarted","Data":"72a24d5f0786b3da9aac01d553c981fdcf13ebc1b2358317a489547c93d570db"} Mar 20 07:29:35 crc kubenswrapper[4749]: I0320 07:29:35.243459 4749 scope.go:117] "RemoveContainer" containerID="5e762cc7631bdd3af893f7f6b529361ab634432f285e3c9f01638e40b5f29d64" Mar 20 07:29:36 crc kubenswrapper[4749]: I0320 07:29:36.565952 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/be367e3118afbcc92c106aa60ad89a847599882d02c81ac0e6d7ee6cbcpg7z7" Mar 20 07:29:36 crc kubenswrapper[4749]: I0320 07:29:36.620317 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a1525cb6-0a04-499c-8737-81f1981815da-bundle\") pod \"a1525cb6-0a04-499c-8737-81f1981815da\" (UID: \"a1525cb6-0a04-499c-8737-81f1981815da\") " Mar 20 07:29:36 crc kubenswrapper[4749]: I0320 07:29:36.620437 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rn9x\" (UniqueName: \"kubernetes.io/projected/a1525cb6-0a04-499c-8737-81f1981815da-kube-api-access-8rn9x\") pod \"a1525cb6-0a04-499c-8737-81f1981815da\" (UID: \"a1525cb6-0a04-499c-8737-81f1981815da\") " Mar 20 07:29:36 crc kubenswrapper[4749]: I0320 07:29:36.620524 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a1525cb6-0a04-499c-8737-81f1981815da-util\") pod \"a1525cb6-0a04-499c-8737-81f1981815da\" (UID: \"a1525cb6-0a04-499c-8737-81f1981815da\") " Mar 20 07:29:36 crc kubenswrapper[4749]: I0320 07:29:36.621102 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1525cb6-0a04-499c-8737-81f1981815da-bundle" (OuterVolumeSpecName: "bundle") pod "a1525cb6-0a04-499c-8737-81f1981815da" (UID: "a1525cb6-0a04-499c-8737-81f1981815da"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:29:36 crc kubenswrapper[4749]: I0320 07:29:36.629133 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1525cb6-0a04-499c-8737-81f1981815da-kube-api-access-8rn9x" (OuterVolumeSpecName: "kube-api-access-8rn9x") pod "a1525cb6-0a04-499c-8737-81f1981815da" (UID: "a1525cb6-0a04-499c-8737-81f1981815da"). InnerVolumeSpecName "kube-api-access-8rn9x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:29:36 crc kubenswrapper[4749]: I0320 07:29:36.639210 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1525cb6-0a04-499c-8737-81f1981815da-util" (OuterVolumeSpecName: "util") pod "a1525cb6-0a04-499c-8737-81f1981815da" (UID: "a1525cb6-0a04-499c-8737-81f1981815da"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:29:36 crc kubenswrapper[4749]: I0320 07:29:36.722590 4749 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a1525cb6-0a04-499c-8737-81f1981815da-util\") on node \"crc\" DevicePath \"\"" Mar 20 07:29:36 crc kubenswrapper[4749]: I0320 07:29:36.722632 4749 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a1525cb6-0a04-499c-8737-81f1981815da-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:29:36 crc kubenswrapper[4749]: I0320 07:29:36.722647 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rn9x\" (UniqueName: \"kubernetes.io/projected/a1525cb6-0a04-499c-8737-81f1981815da-kube-api-access-8rn9x\") on node \"crc\" DevicePath \"\"" Mar 20 07:29:37 crc kubenswrapper[4749]: I0320 07:29:37.273734 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/be367e3118afbcc92c106aa60ad89a847599882d02c81ac0e6d7ee6cbcpg7z7" event={"ID":"a1525cb6-0a04-499c-8737-81f1981815da","Type":"ContainerDied","Data":"289bba42305ad3187de3ec3148a6f66aba1064115166955da3460ce30b7811e6"} Mar 20 07:29:37 crc kubenswrapper[4749]: I0320 07:29:37.273811 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="289bba42305ad3187de3ec3148a6f66aba1064115166955da3460ce30b7811e6" Mar 20 07:29:37 crc kubenswrapper[4749]: I0320 07:29:37.273894 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/be367e3118afbcc92c106aa60ad89a847599882d02c81ac0e6d7ee6cbcpg7z7" Mar 20 07:29:43 crc kubenswrapper[4749]: I0320 07:29:43.347155 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-7f68b6bcd8-5cmds"] Mar 20 07:29:43 crc kubenswrapper[4749]: E0320 07:29:43.347858 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1525cb6-0a04-499c-8737-81f1981815da" containerName="extract" Mar 20 07:29:43 crc kubenswrapper[4749]: I0320 07:29:43.347869 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1525cb6-0a04-499c-8737-81f1981815da" containerName="extract" Mar 20 07:29:43 crc kubenswrapper[4749]: E0320 07:29:43.347883 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1525cb6-0a04-499c-8737-81f1981815da" containerName="util" Mar 20 07:29:43 crc kubenswrapper[4749]: I0320 07:29:43.347888 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1525cb6-0a04-499c-8737-81f1981815da" containerName="util" Mar 20 07:29:43 crc kubenswrapper[4749]: E0320 07:29:43.347895 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1525cb6-0a04-499c-8737-81f1981815da" containerName="pull" Mar 20 07:29:43 crc kubenswrapper[4749]: I0320 07:29:43.347901 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1525cb6-0a04-499c-8737-81f1981815da" containerName="pull" Mar 20 07:29:43 crc kubenswrapper[4749]: I0320 07:29:43.348005 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1525cb6-0a04-499c-8737-81f1981815da" containerName="extract" Mar 20 07:29:43 crc kubenswrapper[4749]: I0320 07:29:43.348389 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-7f68b6bcd8-5cmds" Mar 20 07:29:43 crc kubenswrapper[4749]: W0320 07:29:43.350409 4749 reflector.go:561] object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-v8n7f": failed to list *v1.Secret: secrets "openstack-operator-controller-init-dockercfg-v8n7f" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack-operators": no relationship found between node 'crc' and this object Mar 20 07:29:43 crc kubenswrapper[4749]: E0320 07:29:43.350489 4749 reflector.go:158] "Unhandled Error" err="object-\"openstack-operators\"/\"openstack-operator-controller-init-dockercfg-v8n7f\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openstack-operator-controller-init-dockercfg-v8n7f\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack-operators\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 20 07:29:43 crc kubenswrapper[4749]: I0320 07:29:43.386470 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-7f68b6bcd8-5cmds"] Mar 20 07:29:43 crc kubenswrapper[4749]: I0320 07:29:43.404964 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shrbn\" (UniqueName: \"kubernetes.io/projected/1cf0a010-6087-4784-8303-8be78ad550e1-kube-api-access-shrbn\") pod \"openstack-operator-controller-init-7f68b6bcd8-5cmds\" (UID: \"1cf0a010-6087-4784-8303-8be78ad550e1\") " pod="openstack-operators/openstack-operator-controller-init-7f68b6bcd8-5cmds" Mar 20 07:29:43 crc kubenswrapper[4749]: I0320 07:29:43.506517 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shrbn\" (UniqueName: \"kubernetes.io/projected/1cf0a010-6087-4784-8303-8be78ad550e1-kube-api-access-shrbn\") pod \"openstack-operator-controller-init-7f68b6bcd8-5cmds\" (UID: \"1cf0a010-6087-4784-8303-8be78ad550e1\") " pod="openstack-operators/openstack-operator-controller-init-7f68b6bcd8-5cmds" Mar 20 07:29:43 crc kubenswrapper[4749]: I0320 07:29:43.524438 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shrbn\" (UniqueName: \"kubernetes.io/projected/1cf0a010-6087-4784-8303-8be78ad550e1-kube-api-access-shrbn\") pod \"openstack-operator-controller-init-7f68b6bcd8-5cmds\" (UID: \"1cf0a010-6087-4784-8303-8be78ad550e1\") " pod="openstack-operators/openstack-operator-controller-init-7f68b6bcd8-5cmds" Mar 20 07:29:44 crc kubenswrapper[4749]: I0320 07:29:44.199932 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-v8n7f" Mar 20 07:29:44 crc kubenswrapper[4749]: I0320 07:29:44.205025 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-7f68b6bcd8-5cmds" Mar 20 07:29:44 crc kubenswrapper[4749]: I0320 07:29:44.467103 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-7f68b6bcd8-5cmds"] Mar 20 07:29:45 crc kubenswrapper[4749]: I0320 07:29:45.325336 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-7f68b6bcd8-5cmds" event={"ID":"1cf0a010-6087-4784-8303-8be78ad550e1","Type":"ContainerStarted","Data":"280342fa69b3832218f82c0a2921afda78f4a728f7f1ffe6d686d08e712bce28"} Mar 20 07:29:49 crc kubenswrapper[4749]: I0320 07:29:49.357270 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-7f68b6bcd8-5cmds" event={"ID":"1cf0a010-6087-4784-8303-8be78ad550e1","Type":"ContainerStarted","Data":"2e32779f993aa2968a1812c8fcd620c5cbdc6b4ae4f56ca56f1e9b1a8a865e8e"} Mar 20 07:29:49 crc kubenswrapper[4749]: I0320 07:29:49.358010 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-7f68b6bcd8-5cmds" Mar 20 07:29:49 crc kubenswrapper[4749]: I0320 07:29:49.413217 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-7f68b6bcd8-5cmds" podStartSLOduration=2.661439997 podStartE2EDuration="6.413183798s" podCreationTimestamp="2026-03-20 07:29:43 +0000 UTC" firstStartedPulling="2026-03-20 07:29:44.475382883 +0000 UTC m=+1021.025040570" lastFinishedPulling="2026-03-20 07:29:48.227126724 +0000 UTC m=+1024.776784371" observedRunningTime="2026-03-20 07:29:49.406215659 +0000 UTC m=+1025.955873346" watchObservedRunningTime="2026-03-20 07:29:49.413183798 +0000 UTC m=+1025.962841485" Mar 20 07:29:54 crc kubenswrapper[4749]: I0320 07:29:54.211693 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-7f68b6bcd8-5cmds" Mar 20 07:30:00 crc kubenswrapper[4749]: I0320 07:30:00.185665 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566530-j4mbn"] Mar 20 07:30:00 crc kubenswrapper[4749]: I0320 07:30:00.186792 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566530-w5ddv"] Mar 20 07:30:00 crc kubenswrapper[4749]: I0320 07:30:00.186958 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566530-j4mbn" Mar 20 07:30:00 crc kubenswrapper[4749]: I0320 07:30:00.187389 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566530-w5ddv" Mar 20 07:30:00 crc kubenswrapper[4749]: I0320 07:30:00.192326 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 07:30:00 crc kubenswrapper[4749]: I0320 07:30:00.192589 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 07:30:00 crc kubenswrapper[4749]: I0320 07:30:00.193898 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 07:30:00 crc kubenswrapper[4749]: I0320 07:30:00.193979 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6rhdf" Mar 20 07:30:00 crc kubenswrapper[4749]: I0320 07:30:00.194033 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 07:30:00 crc kubenswrapper[4749]: I0320 07:30:00.212856 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566530-j4mbn"] Mar 20 07:30:00 crc kubenswrapper[4749]: I0320 07:30:00.218371 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566530-w5ddv"] Mar 20 07:30:00 crc kubenswrapper[4749]: I0320 07:30:00.241025 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6c0a353f-02c0-4246-b879-02a58abaf589-config-volume\") pod \"collect-profiles-29566530-w5ddv\" (UID: \"6c0a353f-02c0-4246-b879-02a58abaf589\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566530-w5ddv" Mar 20 07:30:00 crc kubenswrapper[4749]: I0320 07:30:00.241140 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jp5n7\" (UniqueName: \"kubernetes.io/projected/6c0a353f-02c0-4246-b879-02a58abaf589-kube-api-access-jp5n7\") pod \"collect-profiles-29566530-w5ddv\" (UID: \"6c0a353f-02c0-4246-b879-02a58abaf589\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566530-w5ddv" Mar 20 07:30:00 crc kubenswrapper[4749]: I0320 07:30:00.241198 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4fjf\" (UniqueName: \"kubernetes.io/projected/98099ee0-eea2-4964-82cd-f8a255f33811-kube-api-access-h4fjf\") pod \"auto-csr-approver-29566530-j4mbn\" (UID: \"98099ee0-eea2-4964-82cd-f8a255f33811\") " pod="openshift-infra/auto-csr-approver-29566530-j4mbn" Mar 20 07:30:00 crc kubenswrapper[4749]: I0320 07:30:00.241240 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6c0a353f-02c0-4246-b879-02a58abaf589-secret-volume\") pod \"collect-profiles-29566530-w5ddv\" (UID: \"6c0a353f-02c0-4246-b879-02a58abaf589\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566530-w5ddv" Mar 20 07:30:00 crc kubenswrapper[4749]: I0320 07:30:00.341883 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jp5n7\" (UniqueName: \"kubernetes.io/projected/6c0a353f-02c0-4246-b879-02a58abaf589-kube-api-access-jp5n7\") pod \"collect-profiles-29566530-w5ddv\" (UID: \"6c0a353f-02c0-4246-b879-02a58abaf589\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29566530-w5ddv" Mar 20 07:30:00 crc kubenswrapper[4749]: I0320 07:30:00.341928 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4fjf\" (UniqueName: \"kubernetes.io/projected/98099ee0-eea2-4964-82cd-f8a255f33811-kube-api-access-h4fjf\") pod \"auto-csr-approver-29566530-j4mbn\" (UID: \"98099ee0-eea2-4964-82cd-f8a255f33811\") " pod="openshift-infra/auto-csr-approver-29566530-j4mbn" Mar 20 07:30:00 crc kubenswrapper[4749]: I0320 07:30:00.341953 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6c0a353f-02c0-4246-b879-02a58abaf589-secret-volume\") pod \"collect-profiles-29566530-w5ddv\" (UID: \"6c0a353f-02c0-4246-b879-02a58abaf589\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566530-w5ddv" Mar 20 07:30:00 crc kubenswrapper[4749]: I0320 07:30:00.342004 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6c0a353f-02c0-4246-b879-02a58abaf589-config-volume\") pod \"collect-profiles-29566530-w5ddv\" (UID: \"6c0a353f-02c0-4246-b879-02a58abaf589\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566530-w5ddv" Mar 20 07:30:00 crc kubenswrapper[4749]: I0320 07:30:00.342898 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6c0a353f-02c0-4246-b879-02a58abaf589-config-volume\") pod \"collect-profiles-29566530-w5ddv\" (UID: \"6c0a353f-02c0-4246-b879-02a58abaf589\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566530-w5ddv" Mar 20 07:30:00 crc kubenswrapper[4749]: I0320 07:30:00.350924 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6c0a353f-02c0-4246-b879-02a58abaf589-secret-volume\") pod \"collect-profiles-29566530-w5ddv\" (UID: \"6c0a353f-02c0-4246-b879-02a58abaf589\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566530-w5ddv" Mar 20 07:30:00 crc kubenswrapper[4749]: I0320 07:30:00.361605 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jp5n7\" (UniqueName: \"kubernetes.io/projected/6c0a353f-02c0-4246-b879-02a58abaf589-kube-api-access-jp5n7\") pod \"collect-profiles-29566530-w5ddv\" (UID: \"6c0a353f-02c0-4246-b879-02a58abaf589\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566530-w5ddv" Mar 20 07:30:00 crc kubenswrapper[4749]: I0320 07:30:00.364412 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4fjf\" (UniqueName: \"kubernetes.io/projected/98099ee0-eea2-4964-82cd-f8a255f33811-kube-api-access-h4fjf\") pod \"auto-csr-approver-29566530-j4mbn\" (UID: \"98099ee0-eea2-4964-82cd-f8a255f33811\") " pod="openshift-infra/auto-csr-approver-29566530-j4mbn" Mar 20 07:30:00 crc kubenswrapper[4749]: I0320 07:30:00.504375 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566530-j4mbn" Mar 20 07:30:00 crc kubenswrapper[4749]: I0320 07:30:00.514618 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566530-w5ddv" Mar 20 07:30:00 crc kubenswrapper[4749]: I0320 07:30:00.837785 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566530-j4mbn"] Mar 20 07:30:01 crc kubenswrapper[4749]: I0320 07:30:01.090359 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566530-w5ddv"] Mar 20 07:30:01 crc kubenswrapper[4749]: W0320 07:30:01.095511 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c0a353f_02c0_4246_b879_02a58abaf589.slice/crio-e88cbf2b9390d21e739a201d681dc83c9e5fff409e6c42df6d5be526573fe926 WatchSource:0}: Error finding container e88cbf2b9390d21e739a201d681dc83c9e5fff409e6c42df6d5be526573fe926: Status 404 returned error can't find the container with id e88cbf2b9390d21e739a201d681dc83c9e5fff409e6c42df6d5be526573fe926 Mar 20 07:30:01 crc kubenswrapper[4749]: I0320 07:30:01.445816 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566530-w5ddv" event={"ID":"6c0a353f-02c0-4246-b879-02a58abaf589","Type":"ContainerStarted","Data":"31fa872ec8ff924497eb3f4301b16f5d76aaf69523454902a43e3fa81c14ed6a"} Mar 20 07:30:01 crc kubenswrapper[4749]: I0320 07:30:01.445867 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566530-w5ddv" event={"ID":"6c0a353f-02c0-4246-b879-02a58abaf589","Type":"ContainerStarted","Data":"e88cbf2b9390d21e739a201d681dc83c9e5fff409e6c42df6d5be526573fe926"} Mar 20 07:30:01 crc kubenswrapper[4749]: I0320 07:30:01.447657 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566530-j4mbn" event={"ID":"98099ee0-eea2-4964-82cd-f8a255f33811","Type":"ContainerStarted","Data":"d2cf9b82df2755f9ee42ddf2d0ff574c16b8652a4b451ce123c4d27111d4ab2b"} Mar 20 07:30:02 crc kubenswrapper[4749]: I0320 07:30:02.453486 4749 generic.go:334] "Generic (PLEG): container finished" podID="6c0a353f-02c0-4246-b879-02a58abaf589" containerID="31fa872ec8ff924497eb3f4301b16f5d76aaf69523454902a43e3fa81c14ed6a" exitCode=0 Mar 20 07:30:02 crc kubenswrapper[4749]: I0320 07:30:02.453775 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566530-w5ddv" event={"ID":"6c0a353f-02c0-4246-b879-02a58abaf589","Type":"ContainerDied","Data":"31fa872ec8ff924497eb3f4301b16f5d76aaf69523454902a43e3fa81c14ed6a"} Mar 20 07:30:03 crc kubenswrapper[4749]: I0320 07:30:03.466124 4749 generic.go:334] "Generic (PLEG): container finished" podID="98099ee0-eea2-4964-82cd-f8a255f33811" containerID="b269f05024d71e0720b806b8519a352e516cecac758dd3274ceb2a90f86cd520" exitCode=0 Mar 20 07:30:03 crc kubenswrapper[4749]: I0320 07:30:03.466351 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566530-j4mbn" event={"ID":"98099ee0-eea2-4964-82cd-f8a255f33811","Type":"ContainerDied","Data":"b269f05024d71e0720b806b8519a352e516cecac758dd3274ceb2a90f86cd520"} Mar 20 07:30:03 crc kubenswrapper[4749]: I0320 07:30:03.791445 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566530-w5ddv" Mar 20 07:30:03 crc kubenswrapper[4749]: I0320 07:30:03.891253 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6c0a353f-02c0-4246-b879-02a58abaf589-config-volume\") pod \"6c0a353f-02c0-4246-b879-02a58abaf589\" (UID: \"6c0a353f-02c0-4246-b879-02a58abaf589\") " Mar 20 07:30:03 crc kubenswrapper[4749]: I0320 07:30:03.891336 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6c0a353f-02c0-4246-b879-02a58abaf589-secret-volume\") pod \"6c0a353f-02c0-4246-b879-02a58abaf589\" (UID: \"6c0a353f-02c0-4246-b879-02a58abaf589\") " Mar 20 07:30:03 crc kubenswrapper[4749]: I0320 07:30:03.891398 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jp5n7\" (UniqueName: \"kubernetes.io/projected/6c0a353f-02c0-4246-b879-02a58abaf589-kube-api-access-jp5n7\") pod \"6c0a353f-02c0-4246-b879-02a58abaf589\" (UID: \"6c0a353f-02c0-4246-b879-02a58abaf589\") " Mar 20 07:30:03 crc kubenswrapper[4749]: I0320 07:30:03.892608 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c0a353f-02c0-4246-b879-02a58abaf589-config-volume" (OuterVolumeSpecName: "config-volume") pod "6c0a353f-02c0-4246-b879-02a58abaf589" (UID: "6c0a353f-02c0-4246-b879-02a58abaf589"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:30:03 crc kubenswrapper[4749]: I0320 07:30:03.896610 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c0a353f-02c0-4246-b879-02a58abaf589-kube-api-access-jp5n7" (OuterVolumeSpecName: "kube-api-access-jp5n7") pod "6c0a353f-02c0-4246-b879-02a58abaf589" (UID: "6c0a353f-02c0-4246-b879-02a58abaf589"). InnerVolumeSpecName "kube-api-access-jp5n7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:30:03 crc kubenswrapper[4749]: I0320 07:30:03.897361 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c0a353f-02c0-4246-b879-02a58abaf589-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6c0a353f-02c0-4246-b879-02a58abaf589" (UID: "6c0a353f-02c0-4246-b879-02a58abaf589"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:30:03 crc kubenswrapper[4749]: I0320 07:30:03.992748 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6c0a353f-02c0-4246-b879-02a58abaf589-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 07:30:03 crc kubenswrapper[4749]: I0320 07:30:03.992782 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jp5n7\" (UniqueName: \"kubernetes.io/projected/6c0a353f-02c0-4246-b879-02a58abaf589-kube-api-access-jp5n7\") on node \"crc\" DevicePath \"\"" Mar 20 07:30:03 crc kubenswrapper[4749]: I0320 07:30:03.992791 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6c0a353f-02c0-4246-b879-02a58abaf589-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 07:30:04 crc kubenswrapper[4749]: I0320 07:30:04.474478 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566530-w5ddv" event={"ID":"6c0a353f-02c0-4246-b879-02a58abaf589","Type":"ContainerDied","Data":"e88cbf2b9390d21e739a201d681dc83c9e5fff409e6c42df6d5be526573fe926"} Mar 20 07:30:04 crc kubenswrapper[4749]: I0320 07:30:04.474535 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e88cbf2b9390d21e739a201d681dc83c9e5fff409e6c42df6d5be526573fe926" Mar 20 07:30:04 crc kubenswrapper[4749]: I0320 07:30:04.474507 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566530-w5ddv" Mar 20 07:30:04 crc kubenswrapper[4749]: I0320 07:30:04.809807 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566530-j4mbn" Mar 20 07:30:05 crc kubenswrapper[4749]: I0320 07:30:05.008247 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4fjf\" (UniqueName: \"kubernetes.io/projected/98099ee0-eea2-4964-82cd-f8a255f33811-kube-api-access-h4fjf\") pod \"98099ee0-eea2-4964-82cd-f8a255f33811\" (UID: \"98099ee0-eea2-4964-82cd-f8a255f33811\") " Mar 20 07:30:05 crc kubenswrapper[4749]: I0320 07:30:05.012748 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98099ee0-eea2-4964-82cd-f8a255f33811-kube-api-access-h4fjf" (OuterVolumeSpecName: "kube-api-access-h4fjf") pod "98099ee0-eea2-4964-82cd-f8a255f33811" (UID: "98099ee0-eea2-4964-82cd-f8a255f33811"). InnerVolumeSpecName "kube-api-access-h4fjf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:30:05 crc kubenswrapper[4749]: I0320 07:30:05.110160 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4fjf\" (UniqueName: \"kubernetes.io/projected/98099ee0-eea2-4964-82cd-f8a255f33811-kube-api-access-h4fjf\") on node \"crc\" DevicePath \"\"" Mar 20 07:30:05 crc kubenswrapper[4749]: I0320 07:30:05.483743 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566530-j4mbn" event={"ID":"98099ee0-eea2-4964-82cd-f8a255f33811","Type":"ContainerDied","Data":"d2cf9b82df2755f9ee42ddf2d0ff574c16b8652a4b451ce123c4d27111d4ab2b"} Mar 20 07:30:05 crc kubenswrapper[4749]: I0320 07:30:05.484871 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2cf9b82df2755f9ee42ddf2d0ff574c16b8652a4b451ce123c4d27111d4ab2b" Mar 20 07:30:05 crc kubenswrapper[4749]: I0320 07:30:05.484012 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566530-j4mbn" Mar 20 07:30:05 crc kubenswrapper[4749]: I0320 07:30:05.874186 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566524-csb8r"] Mar 20 07:30:05 crc kubenswrapper[4749]: I0320 07:30:05.879635 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566524-csb8r"] Mar 20 07:30:06 crc kubenswrapper[4749]: I0320 07:30:06.185224 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6d3a1eb-794e-4c9c-988b-9aef650c37b0" path="/var/lib/kubelet/pods/c6d3a1eb-794e-4c9c-988b-9aef650c37b0/volumes" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.100343 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-bbhfb"] Mar 20 07:30:14 crc kubenswrapper[4749]: E0320 07:30:14.102390 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c0a353f-02c0-4246-b879-02a58abaf589" containerName="collect-profiles" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.102476 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c0a353f-02c0-4246-b879-02a58abaf589" containerName="collect-profiles" Mar 20 07:30:14 crc kubenswrapper[4749]: E0320 07:30:14.102550 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98099ee0-eea2-4964-82cd-f8a255f33811" containerName="oc" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.102605 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="98099ee0-eea2-4964-82cd-f8a255f33811" containerName="oc" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.102755 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="98099ee0-eea2-4964-82cd-f8a255f33811" containerName="oc" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.102817 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c0a353f-02c0-4246-b879-02a58abaf589" containerName="collect-profiles" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.103329 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-bbhfb" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.105387 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-hd9hm" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.106967 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-gdxrh"] Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.107691 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-gdxrh" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.112945 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-bbhfb"] Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.115145 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-xc4x8" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.118481 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-mfsfk"] Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.119203 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-mfsfk" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.121999 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-r5w4w" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.133540 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbhdq\" (UniqueName: \"kubernetes.io/projected/b97ffca4-e4a1-4fbf-8271-d97410ffa49a-kube-api-access-bbhdq\") pod \"cinder-operator-controller-manager-8d58dc466-gdxrh\" (UID: \"b97ffca4-e4a1-4fbf-8271-d97410ffa49a\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-gdxrh" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.133591 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k7fb\" (UniqueName: \"kubernetes.io/projected/99111621-16af-4be2-b4d4-ce9b82e41165-kube-api-access-9k7fb\") pod \"barbican-operator-controller-manager-59bc569d95-bbhfb\" (UID: \"99111621-16af-4be2-b4d4-ce9b82e41165\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-bbhfb" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.133616 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgvpm\" (UniqueName: \"kubernetes.io/projected/26434b2d-c04d-42b7-9631-6d0851886141-kube-api-access-lgvpm\") pod \"designate-operator-controller-manager-588d4d986b-mfsfk\" (UID: \"26434b2d-c04d-42b7-9631-6d0851886141\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-mfsfk" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.140031 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-gdxrh"] Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.149810 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-mfsfk"] Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.185740 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-t97b7"] Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.186454 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-t97b7" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.190153 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-j5825" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.203711 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-t97b7"] Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.221228 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-9t9xf"] Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.222341 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-9t9xf" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.223245 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-7bpmp"] Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.225756 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-f2sg8" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.229057 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-7bpmp" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.229212 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-9t9xf"] Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.232550 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-qbs4x" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.235475 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbhdq\" (UniqueName: \"kubernetes.io/projected/b97ffca4-e4a1-4fbf-8271-d97410ffa49a-kube-api-access-bbhdq\") pod \"cinder-operator-controller-manager-8d58dc466-gdxrh\" (UID: \"b97ffca4-e4a1-4fbf-8271-d97410ffa49a\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-gdxrh" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.235519 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9k7fb\" (UniqueName: \"kubernetes.io/projected/99111621-16af-4be2-b4d4-ce9b82e41165-kube-api-access-9k7fb\") pod \"barbican-operator-controller-manager-59bc569d95-bbhfb\" (UID: \"99111621-16af-4be2-b4d4-ce9b82e41165\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-bbhfb" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.235545 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgvpm\" (UniqueName: \"kubernetes.io/projected/26434b2d-c04d-42b7-9631-6d0851886141-kube-api-access-lgvpm\") pod \"designate-operator-controller-manager-588d4d986b-mfsfk\" (UID: \"26434b2d-c04d-42b7-9631-6d0851886141\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-mfsfk" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.249112 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-7bpmp"] Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.256268 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-5d9899ccc6-2x44r"] Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.257158 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5d9899ccc6-2x44r" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.260195 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-wh9bk" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.260389 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.262539 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k7fb\" (UniqueName: \"kubernetes.io/projected/99111621-16af-4be2-b4d4-ce9b82e41165-kube-api-access-9k7fb\") pod \"barbican-operator-controller-manager-59bc569d95-bbhfb\" (UID: \"99111621-16af-4be2-b4d4-ce9b82e41165\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-bbhfb" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.264418 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgvpm\" (UniqueName: \"kubernetes.io/projected/26434b2d-c04d-42b7-9631-6d0851886141-kube-api-access-lgvpm\") pod \"designate-operator-controller-manager-588d4d986b-mfsfk\" (UID: \"26434b2d-c04d-42b7-9631-6d0851886141\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-mfsfk" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.268342 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbhdq\" (UniqueName: \"kubernetes.io/projected/b97ffca4-e4a1-4fbf-8271-d97410ffa49a-kube-api-access-bbhdq\") pod \"cinder-operator-controller-manager-8d58dc466-gdxrh\" (UID: \"b97ffca4-e4a1-4fbf-8271-d97410ffa49a\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-gdxrh" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.268401 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-5jlcc"] Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.269174 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-5jlcc" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.277624 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-hx2vg" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.292727 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5d9899ccc6-2x44r"] Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.325327 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-5jlcc"] Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.339796 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjsx6\" (UniqueName: \"kubernetes.io/projected/5af1049c-beed-4d2a-93da-95171c0142e3-kube-api-access-tjsx6\") pod \"infra-operator-controller-manager-5d9899ccc6-2x44r\" (UID: \"5af1049c-beed-4d2a-93da-95171c0142e3\") " pod="openstack-operators/infra-operator-controller-manager-5d9899ccc6-2x44r" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.340322 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zwtc\" (UniqueName: \"kubernetes.io/projected/a56dfc81-4a0f-4e99-a884-cff054d164b9-kube-api-access-4zwtc\") pod \"horizon-operator-controller-manager-8464cc45fb-7bpmp\" (UID: \"a56dfc81-4a0f-4e99-a884-cff054d164b9\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-7bpmp" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.340359 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gq8m\" (UniqueName: \"kubernetes.io/projected/cee7836b-e12f-4de9-be6b-4caa60294269-kube-api-access-6gq8m\") pod \"ironic-operator-controller-manager-6f787dddc9-5jlcc\" (UID: \"cee7836b-e12f-4de9-be6b-4caa60294269\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-5jlcc" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.340395 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5af1049c-beed-4d2a-93da-95171c0142e3-cert\") pod \"infra-operator-controller-manager-5d9899ccc6-2x44r\" (UID: \"5af1049c-beed-4d2a-93da-95171c0142e3\") " pod="openstack-operators/infra-operator-controller-manager-5d9899ccc6-2x44r" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.340428 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58294\" (UniqueName: \"kubernetes.io/projected/88958bd4-4087-4f7c-b72e-9c2cea412993-kube-api-access-58294\") pod \"heat-operator-controller-manager-67dd5f86f5-9t9xf\" (UID: \"88958bd4-4087-4f7c-b72e-9c2cea412993\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-9t9xf" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.340461 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx75r\" (UniqueName: \"kubernetes.io/projected/06c975b5-ec27-4ff9-b7bb-115c12275ac2-kube-api-access-kx75r\") pod \"glance-operator-controller-manager-79df6bcc97-t97b7\" (UID: \"06c975b5-ec27-4ff9-b7bb-115c12275ac2\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-t97b7" Mar 20 07:30:14 crc 
kubenswrapper[4749]: I0320 07:30:14.363885 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-fpgkx"] Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.364751 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-fpgkx" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.400336 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-x4tww" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.402078 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-fpgkx"] Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.427761 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-bbhfb" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.433010 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-zpwsq"] Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.438777 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-zpwsq" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.441449 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gq8m\" (UniqueName: \"kubernetes.io/projected/cee7836b-e12f-4de9-be6b-4caa60294269-kube-api-access-6gq8m\") pod \"ironic-operator-controller-manager-6f787dddc9-5jlcc\" (UID: \"cee7836b-e12f-4de9-be6b-4caa60294269\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-5jlcc" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.441502 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5af1049c-beed-4d2a-93da-95171c0142e3-cert\") pod \"infra-operator-controller-manager-5d9899ccc6-2x44r\" (UID: \"5af1049c-beed-4d2a-93da-95171c0142e3\") " pod="openstack-operators/infra-operator-controller-manager-5d9899ccc6-2x44r" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.441535 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58294\" (UniqueName: \"kubernetes.io/projected/88958bd4-4087-4f7c-b72e-9c2cea412993-kube-api-access-58294\") pod \"heat-operator-controller-manager-67dd5f86f5-9t9xf\" (UID: \"88958bd4-4087-4f7c-b72e-9c2cea412993\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-9t9xf" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.441559 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4j688\" (UniqueName: \"kubernetes.io/projected/a0cd89a4-110c-4df5-b9ce-186f38d9be30-kube-api-access-4j688\") pod \"keystone-operator-controller-manager-768b96df4c-fpgkx\" (UID: \"a0cd89a4-110c-4df5-b9ce-186f38d9be30\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-fpgkx" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.441593 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kx75r\" (UniqueName: \"kubernetes.io/projected/06c975b5-ec27-4ff9-b7bb-115c12275ac2-kube-api-access-kx75r\") pod 
\"glance-operator-controller-manager-79df6bcc97-t97b7\" (UID: \"06c975b5-ec27-4ff9-b7bb-115c12275ac2\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-t97b7" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.441643 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjsx6\" (UniqueName: \"kubernetes.io/projected/5af1049c-beed-4d2a-93da-95171c0142e3-kube-api-access-tjsx6\") pod \"infra-operator-controller-manager-5d9899ccc6-2x44r\" (UID: \"5af1049c-beed-4d2a-93da-95171c0142e3\") " pod="openstack-operators/infra-operator-controller-manager-5d9899ccc6-2x44r" Mar 20 07:30:14 crc kubenswrapper[4749]: E0320 07:30:14.443030 4749 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 07:30:14 crc kubenswrapper[4749]: E0320 07:30:14.443094 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5af1049c-beed-4d2a-93da-95171c0142e3-cert podName:5af1049c-beed-4d2a-93da-95171c0142e3 nodeName:}" failed. No retries permitted until 2026-03-20 07:30:14.9430788 +0000 UTC m=+1051.492736447 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5af1049c-beed-4d2a-93da-95171c0142e3-cert") pod "infra-operator-controller-manager-5d9899ccc6-2x44r" (UID: "5af1049c-beed-4d2a-93da-95171c0142e3") : secret "infra-operator-webhook-server-cert" not found Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.444738 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdl4h\" (UniqueName: \"kubernetes.io/projected/fa996ed9-64cd-4371-80e7-8122c77285fc-kube-api-access-tdl4h\") pod \"manila-operator-controller-manager-55f864c847-zpwsq\" (UID: \"fa996ed9-64cd-4371-80e7-8122c77285fc\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-zpwsq" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.444785 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zwtc\" (UniqueName: \"kubernetes.io/projected/a56dfc81-4a0f-4e99-a884-cff054d164b9-kube-api-access-4zwtc\") pod \"horizon-operator-controller-manager-8464cc45fb-7bpmp\" (UID: \"a56dfc81-4a0f-4e99-a884-cff054d164b9\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-7bpmp" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.447097 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-9j558" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.448276 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-gdxrh" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.458721 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-mfsfk" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.465561 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58294\" (UniqueName: \"kubernetes.io/projected/88958bd4-4087-4f7c-b72e-9c2cea412993-kube-api-access-58294\") pod \"heat-operator-controller-manager-67dd5f86f5-9t9xf\" (UID: \"88958bd4-4087-4f7c-b72e-9c2cea412993\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-9t9xf" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.471774 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx75r\" (UniqueName: \"kubernetes.io/projected/06c975b5-ec27-4ff9-b7bb-115c12275ac2-kube-api-access-kx75r\") pod \"glance-operator-controller-manager-79df6bcc97-t97b7\" (UID: \"06c975b5-ec27-4ff9-b7bb-115c12275ac2\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-t97b7" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.471967 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gq8m\" (UniqueName: \"kubernetes.io/projected/cee7836b-e12f-4de9-be6b-4caa60294269-kube-api-access-6gq8m\") pod \"ironic-operator-controller-manager-6f787dddc9-5jlcc\" (UID: \"cee7836b-e12f-4de9-be6b-4caa60294269\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-5jlcc" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.473985 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zwtc\" (UniqueName: \"kubernetes.io/projected/a56dfc81-4a0f-4e99-a884-cff054d164b9-kube-api-access-4zwtc\") pod \"horizon-operator-controller-manager-8464cc45fb-7bpmp\" (UID: \"a56dfc81-4a0f-4e99-a884-cff054d164b9\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-7bpmp" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.476421 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-mscpf"] Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.477499 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-mscpf" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.478386 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjsx6\" (UniqueName: \"kubernetes.io/projected/5af1049c-beed-4d2a-93da-95171c0142e3-kube-api-access-tjsx6\") pod \"infra-operator-controller-manager-5d9899ccc6-2x44r\" (UID: \"5af1049c-beed-4d2a-93da-95171c0142e3\") " pod="openstack-operators/infra-operator-controller-manager-5d9899ccc6-2x44r" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.480220 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-nkm6w" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.513818 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-t97b7" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.525107 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-zpwsq"] Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.530014 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-mscpf"] Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.535115 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-548s6"] Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.536093 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-548s6" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.539248 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-kmkkw" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.545560 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj4rk\" (UniqueName: \"kubernetes.io/projected/6b2dc985-5b75-4bc6-8c79-392034f38960-kube-api-access-mj4rk\") pod \"mariadb-operator-controller-manager-67ccfc9778-mscpf\" (UID: \"6b2dc985-5b75-4bc6-8c79-392034f38960\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-mscpf" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.545653 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdl4h\" (UniqueName: \"kubernetes.io/projected/fa996ed9-64cd-4371-80e7-8122c77285fc-kube-api-access-tdl4h\") pod \"manila-operator-controller-manager-55f864c847-zpwsq\" (UID: \"fa996ed9-64cd-4371-80e7-8122c77285fc\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-zpwsq" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.545705 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqmns\" (UniqueName: \"kubernetes.io/projected/cd82533b-5f9e-45e3-a645-90e678bcbf4a-kube-api-access-zqmns\") pod \"neutron-operator-controller-manager-767865f676-548s6\" (UID: \"cd82533b-5f9e-45e3-a645-90e678bcbf4a\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-548s6" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.545726 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4j688\" (UniqueName: \"kubernetes.io/projected/a0cd89a4-110c-4df5-b9ce-186f38d9be30-kube-api-access-4j688\") pod \"keystone-operator-controller-manager-768b96df4c-fpgkx\" (UID: \"a0cd89a4-110c-4df5-b9ce-186f38d9be30\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-fpgkx" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.549458 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-9t9xf" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.555497 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-6kn5g"] Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.556371 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-6kn5g" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.560611 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-bt59w" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.563521 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-548s6"] Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.565665 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdl4h\" (UniqueName: \"kubernetes.io/projected/fa996ed9-64cd-4371-80e7-8122c77285fc-kube-api-access-tdl4h\") pod \"manila-operator-controller-manager-55f864c847-zpwsq\" (UID: \"fa996ed9-64cd-4371-80e7-8122c77285fc\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-zpwsq" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.570882 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4j688\" (UniqueName: \"kubernetes.io/projected/a0cd89a4-110c-4df5-b9ce-186f38d9be30-kube-api-access-4j688\") pod \"keystone-operator-controller-manager-768b96df4c-fpgkx\" (UID: \"a0cd89a4-110c-4df5-b9ce-186f38d9be30\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-fpgkx" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.585376 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-6kn5g"] Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.624158 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-7bpmp" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.633383 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-dhsjc"] Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.634303 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-dhsjc" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.636142 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-kmvkl" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.642067 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-dhsjc"] Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.647347 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6wh2\" (UniqueName: \"kubernetes.io/projected/a0e5f3af-b138-43f6-b007-ca56ec51851c-kube-api-access-p6wh2\") pod \"nova-operator-controller-manager-5d488d59fb-6kn5g\" (UID: \"a0e5f3af-b138-43f6-b007-ca56ec51851c\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-6kn5g" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.647410 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6lqh\" (UniqueName: \"kubernetes.io/projected/f5230399-dbb2-4a03-afcb-58dd2c1fdd22-kube-api-access-t6lqh\") pod \"octavia-operator-controller-manager-5b9f45d989-dhsjc\" (UID: \"f5230399-dbb2-4a03-afcb-58dd2c1fdd22\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-dhsjc" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.647461 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqmns\" (UniqueName: \"kubernetes.io/projected/cd82533b-5f9e-45e3-a645-90e678bcbf4a-kube-api-access-zqmns\") pod \"neutron-operator-controller-manager-767865f676-548s6\" (UID: \"cd82533b-5f9e-45e3-a645-90e678bcbf4a\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-548s6" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.647528 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj4rk\" (UniqueName: \"kubernetes.io/projected/6b2dc985-5b75-4bc6-8c79-392034f38960-kube-api-access-mj4rk\") pod \"mariadb-operator-controller-manager-67ccfc9778-mscpf\" (UID: \"6b2dc985-5b75-4bc6-8c79-392034f38960\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-mscpf" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.658087 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-4qwrj"] Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.658961 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-4qwrj" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.661781 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.664601 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqmns\" (UniqueName: \"kubernetes.io/projected/cd82533b-5f9e-45e3-a645-90e678bcbf4a-kube-api-access-zqmns\") pod \"neutron-operator-controller-manager-767865f676-548s6\" (UID: \"cd82533b-5f9e-45e3-a645-90e678bcbf4a\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-548s6" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.664698 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-2pnpk" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.665444 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-5jlcc" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.666343 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj4rk\" (UniqueName: \"kubernetes.io/projected/6b2dc985-5b75-4bc6-8c79-392034f38960-kube-api-access-mj4rk\") pod \"mariadb-operator-controller-manager-67ccfc9778-mscpf\" (UID: \"6b2dc985-5b75-4bc6-8c79-392034f38960\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-mscpf" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.723704 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-fpgkx" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.734155 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-kmzst"] Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.735934 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-kmzst" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.741744 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-qcxg7" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.751391 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ba67eb0-3c0d-4558-b603-3626f3980dad-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-4qwrj\" (UID: \"8ba67eb0-3c0d-4558-b603-3626f3980dad\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-4qwrj" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.751461 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrjsh\" (UniqueName: \"kubernetes.io/projected/b4c290c3-309d-4706-935a-0e33bf4e403b-kube-api-access-lrjsh\") pod \"ovn-operator-controller-manager-884679f54-kmzst\" (UID: \"b4c290c3-309d-4706-935a-0e33bf4e403b\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-kmzst" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.751503 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6wh2\" (UniqueName: \"kubernetes.io/projected/a0e5f3af-b138-43f6-b007-ca56ec51851c-kube-api-access-p6wh2\") pod \"nova-operator-controller-manager-5d488d59fb-6kn5g\" (UID: \"a0e5f3af-b138-43f6-b007-ca56ec51851c\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-6kn5g" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.751537 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6lqh\" (UniqueName: \"kubernetes.io/projected/f5230399-dbb2-4a03-afcb-58dd2c1fdd22-kube-api-access-t6lqh\") pod \"octavia-operator-controller-manager-5b9f45d989-dhsjc\" (UID: \"f5230399-dbb2-4a03-afcb-58dd2c1fdd22\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-dhsjc" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.751554 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r64t\" (UniqueName: \"kubernetes.io/projected/8ba67eb0-3c0d-4558-b603-3626f3980dad-kube-api-access-7r64t\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-4qwrj\" (UID: \"8ba67eb0-3c0d-4558-b603-3626f3980dad\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-4qwrj" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.753111 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-nqz8s"] Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.753970 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-nqz8s" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.756184 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-9cbqb" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.782967 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-kmzst"] Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.787057 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6lqh\" (UniqueName: \"kubernetes.io/projected/f5230399-dbb2-4a03-afcb-58dd2c1fdd22-kube-api-access-t6lqh\") pod \"octavia-operator-controller-manager-5b9f45d989-dhsjc\" (UID: \"f5230399-dbb2-4a03-afcb-58dd2c1fdd22\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-dhsjc" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.788577 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6wh2\" (UniqueName: \"kubernetes.io/projected/a0e5f3af-b138-43f6-b007-ca56ec51851c-kube-api-access-p6wh2\") pod \"nova-operator-controller-manager-5d488d59fb-6kn5g\" (UID: \"a0e5f3af-b138-43f6-b007-ca56ec51851c\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-6kn5g" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.819919 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-nqz8s"] Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.830817 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-4qwrj"] Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.840300 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-289b6"] Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.841538 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-zpwsq" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.843204 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-289b6" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.847838 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-kdz48" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.848018 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-289b6"] Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.852587 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrjsh\" (UniqueName: \"kubernetes.io/projected/b4c290c3-309d-4706-935a-0e33bf4e403b-kube-api-access-lrjsh\") pod \"ovn-operator-controller-manager-884679f54-kmzst\" (UID: \"b4c290c3-309d-4706-935a-0e33bf4e403b\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-kmzst" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.852647 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m6vw\" (UniqueName: \"kubernetes.io/projected/a15a8919-b4d7-418a-b725-38e7d7b0e859-kube-api-access-7m6vw\") pod \"swift-operator-controller-manager-c674c5965-289b6\" (UID: \"a15a8919-b4d7-418a-b725-38e7d7b0e859\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-289b6" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.852683 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc6l2\" (UniqueName: \"kubernetes.io/projected/9d6d1c42-480e-49ac-8a40-233fb95e4a0a-kube-api-access-wc6l2\") pod \"placement-operator-controller-manager-5784578c99-nqz8s\" (UID: \"9d6d1c42-480e-49ac-8a40-233fb95e4a0a\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-nqz8s" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.852702 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7r64t\" (UniqueName: \"kubernetes.io/projected/8ba67eb0-3c0d-4558-b603-3626f3980dad-kube-api-access-7r64t\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-4qwrj\" (UID: \"8ba67eb0-3c0d-4558-b603-3626f3980dad\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-4qwrj" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.852751 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ba67eb0-3c0d-4558-b603-3626f3980dad-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-4qwrj\" (UID: \"8ba67eb0-3c0d-4558-b603-3626f3980dad\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-4qwrj" Mar 20 07:30:14 crc kubenswrapper[4749]: E0320 07:30:14.852870 4749 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 07:30:14 crc kubenswrapper[4749]: E0320 07:30:14.852914 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ba67eb0-3c0d-4558-b603-3626f3980dad-cert podName:8ba67eb0-3c0d-4558-b603-3626f3980dad nodeName:}" failed. No retries permitted until 2026-03-20 07:30:15.352900447 +0000 UTC m=+1051.902558094 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8ba67eb0-3c0d-4558-b603-3626f3980dad-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-4qwrj" (UID: "8ba67eb0-3c0d-4558-b603-3626f3980dad") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.864618 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-mscpf" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.874836 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r64t\" (UniqueName: \"kubernetes.io/projected/8ba67eb0-3c0d-4558-b603-3626f3980dad-kube-api-access-7r64t\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-4qwrj\" (UID: \"8ba67eb0-3c0d-4558-b603-3626f3980dad\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-4qwrj" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.879055 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-548s6" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.883919 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-dq6v4"] Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.884109 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrjsh\" (UniqueName: \"kubernetes.io/projected/b4c290c3-309d-4706-935a-0e33bf4e403b-kube-api-access-lrjsh\") pod \"ovn-operator-controller-manager-884679f54-kmzst\" (UID: \"b4c290c3-309d-4706-935a-0e33bf4e403b\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-kmzst" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.884939 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-dq6v4" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.890157 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-j2z4x" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.891225 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-dq6v4"] Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.910455 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-p867r"] Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.912148 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-p867r" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.918839 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-n2rw2" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.919580 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-p867r"] Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.927308 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-6kn5g" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.954250 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wc6l2\" (UniqueName: \"kubernetes.io/projected/9d6d1c42-480e-49ac-8a40-233fb95e4a0a-kube-api-access-wc6l2\") pod \"placement-operator-controller-manager-5784578c99-nqz8s\" (UID: \"9d6d1c42-480e-49ac-8a40-233fb95e4a0a\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-nqz8s" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.954333 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5af1049c-beed-4d2a-93da-95171c0142e3-cert\") pod \"infra-operator-controller-manager-5d9899ccc6-2x44r\" (UID: \"5af1049c-beed-4d2a-93da-95171c0142e3\") " pod="openstack-operators/infra-operator-controller-manager-5d9899ccc6-2x44r" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.954428 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m6vw\" (UniqueName: \"kubernetes.io/projected/a15a8919-b4d7-418a-b725-38e7d7b0e859-kube-api-access-7m6vw\") pod \"swift-operator-controller-manager-c674c5965-289b6\" (UID: \"a15a8919-b4d7-418a-b725-38e7d7b0e859\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-289b6" Mar 20 07:30:14 crc kubenswrapper[4749]: E0320 07:30:14.955054 4749 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 07:30:14 crc kubenswrapper[4749]: E0320 07:30:14.955106 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5af1049c-beed-4d2a-93da-95171c0142e3-cert podName:5af1049c-beed-4d2a-93da-95171c0142e3 nodeName:}" failed. No retries permitted until 2026-03-20 07:30:15.955084652 +0000 UTC m=+1052.504742299 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5af1049c-beed-4d2a-93da-95171c0142e3-cert") pod "infra-operator-controller-manager-5d9899ccc6-2x44r" (UID: "5af1049c-beed-4d2a-93da-95171c0142e3") : secret "infra-operator-webhook-server-cert" not found Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.964751 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-6pbbk"] Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.967516 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-6pbbk" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.974321 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-fzpbh" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.977008 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc6l2\" (UniqueName: \"kubernetes.io/projected/9d6d1c42-480e-49ac-8a40-233fb95e4a0a-kube-api-access-wc6l2\") pod \"placement-operator-controller-manager-5784578c99-nqz8s\" (UID: \"9d6d1c42-480e-49ac-8a40-233fb95e4a0a\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-nqz8s" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.979902 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m6vw\" (UniqueName: \"kubernetes.io/projected/a15a8919-b4d7-418a-b725-38e7d7b0e859-kube-api-access-7m6vw\") pod \"swift-operator-controller-manager-c674c5965-289b6\" (UID: \"a15a8919-b4d7-418a-b725-38e7d7b0e859\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-289b6" Mar 20 07:30:14 crc kubenswrapper[4749]: I0320 07:30:14.986028 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-6pbbk"] Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.043891 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-dhsjc" Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.045382 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-bbhfb"] Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.055629 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttmr5\" (UniqueName: \"kubernetes.io/projected/7161b86e-8178-40db-a6a3-71f724746aed-kube-api-access-ttmr5\") pod \"telemetry-operator-controller-manager-d6b694c5-dq6v4\" (UID: \"7161b86e-8178-40db-a6a3-71f724746aed\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-dq6v4" Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.055681 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb46c\" (UniqueName: \"kubernetes.io/projected/75ab6716-99ca-4fd9-a632-0bc69d5c3742-kube-api-access-fb46c\") pod \"test-operator-controller-manager-5c5cb9c4d7-p867r\" (UID: \"75ab6716-99ca-4fd9-a632-0bc69d5c3742\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-p867r" Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.076353 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-65c7c8696f-s7w78"] Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.077263 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-65c7c8696f-s7w78" Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.079366 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-kmzst" Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.081684 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.081959 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-wb5rq" Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.082063 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.093105 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-65c7c8696f-s7w78"] Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.108701 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-nqz8s" Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.117091 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-86v48"] Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.118022 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-86v48" Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.124704 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-fjqn2" Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.126045 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-86v48"] Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.133365 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-mfsfk"] Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.149788 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-gdxrh"] Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.156806 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttmr5\" (UniqueName: \"kubernetes.io/projected/7161b86e-8178-40db-a6a3-71f724746aed-kube-api-access-ttmr5\") pod \"telemetry-operator-controller-manager-d6b694c5-dq6v4\" (UID: \"7161b86e-8178-40db-a6a3-71f724746aed\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-dq6v4" Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.156859 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb46c\" (UniqueName: \"kubernetes.io/projected/75ab6716-99ca-4fd9-a632-0bc69d5c3742-kube-api-access-fb46c\") pod \"test-operator-controller-manager-5c5cb9c4d7-p867r\" (UID: \"75ab6716-99ca-4fd9-a632-0bc69d5c3742\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-p867r" Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.156959 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqv5w\" (UniqueName: \"kubernetes.io/projected/ddbd9da2-a48e-4e49-894e-4a9ae1109a73-kube-api-access-pqv5w\") pod 
\"watcher-operator-controller-manager-6c4d75f7f9-6pbbk\" (UID: \"ddbd9da2-a48e-4e49-894e-4a9ae1109a73\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-6pbbk" Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.187830 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-289b6" Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.199161 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttmr5\" (UniqueName: \"kubernetes.io/projected/7161b86e-8178-40db-a6a3-71f724746aed-kube-api-access-ttmr5\") pod \"telemetry-operator-controller-manager-d6b694c5-dq6v4\" (UID: \"7161b86e-8178-40db-a6a3-71f724746aed\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-dq6v4" Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.202387 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb46c\" (UniqueName: \"kubernetes.io/projected/75ab6716-99ca-4fd9-a632-0bc69d5c3742-kube-api-access-fb46c\") pod \"test-operator-controller-manager-5c5cb9c4d7-p867r\" (UID: \"75ab6716-99ca-4fd9-a632-0bc69d5c3742\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-p867r" Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.212126 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-dq6v4" Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.213119 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-9t9xf"] Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.250085 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-p867r" Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.257904 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/461831bb-9c93-49f8-a32e-ec01c4bdc549-webhook-certs\") pod \"openstack-operator-controller-manager-65c7c8696f-s7w78\" (UID: \"461831bb-9c93-49f8-a32e-ec01c4bdc549\") " pod="openstack-operators/openstack-operator-controller-manager-65c7c8696f-s7w78" Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.257955 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fmvn\" (UniqueName: \"kubernetes.io/projected/589f626e-af46-4f5e-98f6-d4ad787f84d8-kube-api-access-6fmvn\") pod \"rabbitmq-cluster-operator-manager-668c99d594-86v48\" (UID: \"589f626e-af46-4f5e-98f6-d4ad787f84d8\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-86v48" Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.258001 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/461831bb-9c93-49f8-a32e-ec01c4bdc549-metrics-certs\") pod \"openstack-operator-controller-manager-65c7c8696f-s7w78\" (UID: \"461831bb-9c93-49f8-a32e-ec01c4bdc549\") " pod="openstack-operators/openstack-operator-controller-manager-65c7c8696f-s7w78" Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.258027 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqv5w\" (UniqueName: \"kubernetes.io/projected/ddbd9da2-a48e-4e49-894e-4a9ae1109a73-kube-api-access-pqv5w\") pod \"watcher-operator-controller-manager-6c4d75f7f9-6pbbk\" (UID: \"ddbd9da2-a48e-4e49-894e-4a9ae1109a73\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-6pbbk" Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.258080 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqbs5\" (UniqueName: \"kubernetes.io/projected/461831bb-9c93-49f8-a32e-ec01c4bdc549-kube-api-access-zqbs5\") pod \"openstack-operator-controller-manager-65c7c8696f-s7w78\" (UID: \"461831bb-9c93-49f8-a32e-ec01c4bdc549\") " pod="openstack-operators/openstack-operator-controller-manager-65c7c8696f-s7w78" Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.294908 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqv5w\" (UniqueName: \"kubernetes.io/projected/ddbd9da2-a48e-4e49-894e-4a9ae1109a73-kube-api-access-pqv5w\") pod \"watcher-operator-controller-manager-6c4d75f7f9-6pbbk\" (UID: \"ddbd9da2-a48e-4e49-894e-4a9ae1109a73\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-6pbbk" Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.295346 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-6pbbk" Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.359702 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqbs5\" (UniqueName: \"kubernetes.io/projected/461831bb-9c93-49f8-a32e-ec01c4bdc549-kube-api-access-zqbs5\") pod \"openstack-operator-controller-manager-65c7c8696f-s7w78\" (UID: \"461831bb-9c93-49f8-a32e-ec01c4bdc549\") " pod="openstack-operators/openstack-operator-controller-manager-65c7c8696f-s7w78" Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.359771 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/461831bb-9c93-49f8-a32e-ec01c4bdc549-webhook-certs\") pod \"openstack-operator-controller-manager-65c7c8696f-s7w78\" (UID: \"461831bb-9c93-49f8-a32e-ec01c4bdc549\") " pod="openstack-operators/openstack-operator-controller-manager-65c7c8696f-s7w78" Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.359792 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fmvn\" (UniqueName: \"kubernetes.io/projected/589f626e-af46-4f5e-98f6-d4ad787f84d8-kube-api-access-6fmvn\") pod \"rabbitmq-cluster-operator-manager-668c99d594-86v48\" (UID: \"589f626e-af46-4f5e-98f6-d4ad787f84d8\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-86v48" Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.359840 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/461831bb-9c93-49f8-a32e-ec01c4bdc549-metrics-certs\") pod \"openstack-operator-controller-manager-65c7c8696f-s7w78\" (UID: \"461831bb-9c93-49f8-a32e-ec01c4bdc549\") " pod="openstack-operators/openstack-operator-controller-manager-65c7c8696f-s7w78" Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.359887 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ba67eb0-3c0d-4558-b603-3626f3980dad-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-4qwrj\" (UID: \"8ba67eb0-3c0d-4558-b603-3626f3980dad\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-4qwrj" Mar 20 07:30:15 crc kubenswrapper[4749]: E0320 07:30:15.359992 4749 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 07:30:15 crc kubenswrapper[4749]: E0320 07:30:15.360036 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ba67eb0-3c0d-4558-b603-3626f3980dad-cert podName:8ba67eb0-3c0d-4558-b603-3626f3980dad nodeName:}" failed. No retries permitted until 2026-03-20 07:30:16.36002187 +0000 UTC m=+1052.909679517 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8ba67eb0-3c0d-4558-b603-3626f3980dad-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-4qwrj" (UID: "8ba67eb0-3c0d-4558-b603-3626f3980dad") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 07:30:15 crc kubenswrapper[4749]: E0320 07:30:15.361023 4749 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 07:30:15 crc kubenswrapper[4749]: E0320 07:30:15.361041 4749 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 07:30:15 crc kubenswrapper[4749]: E0320 07:30:15.361078 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/461831bb-9c93-49f8-a32e-ec01c4bdc549-webhook-certs podName:461831bb-9c93-49f8-a32e-ec01c4bdc549 nodeName:}" failed. No retries permitted until 2026-03-20 07:30:15.861060385 +0000 UTC m=+1052.410718032 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/461831bb-9c93-49f8-a32e-ec01c4bdc549-webhook-certs") pod "openstack-operator-controller-manager-65c7c8696f-s7w78" (UID: "461831bb-9c93-49f8-a32e-ec01c4bdc549") : secret "webhook-server-cert" not found Mar 20 07:30:15 crc kubenswrapper[4749]: E0320 07:30:15.361093 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/461831bb-9c93-49f8-a32e-ec01c4bdc549-metrics-certs podName:461831bb-9c93-49f8-a32e-ec01c4bdc549 nodeName:}" failed. No retries permitted until 2026-03-20 07:30:15.861086276 +0000 UTC m=+1052.410743923 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/461831bb-9c93-49f8-a32e-ec01c4bdc549-metrics-certs") pod "openstack-operator-controller-manager-65c7c8696f-s7w78" (UID: "461831bb-9c93-49f8-a32e-ec01c4bdc549") : secret "metrics-server-cert" not found Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.385376 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fmvn\" (UniqueName: \"kubernetes.io/projected/589f626e-af46-4f5e-98f6-d4ad787f84d8-kube-api-access-6fmvn\") pod \"rabbitmq-cluster-operator-manager-668c99d594-86v48\" (UID: \"589f626e-af46-4f5e-98f6-d4ad787f84d8\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-86v48" Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.385912 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqbs5\" (UniqueName: \"kubernetes.io/projected/461831bb-9c93-49f8-a32e-ec01c4bdc549-kube-api-access-zqbs5\") pod \"openstack-operator-controller-manager-65c7c8696f-s7w78\" (UID: \"461831bb-9c93-49f8-a32e-ec01c4bdc549\") " pod="openstack-operators/openstack-operator-controller-manager-65c7c8696f-s7w78" Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.414819 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-t97b7"] Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.463322 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-86v48" Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.615834 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-9t9xf" event={"ID":"88958bd4-4087-4f7c-b72e-9c2cea412993","Type":"ContainerStarted","Data":"36fd9d658209b9154ac3d32f46855774cd68580f61cb3435cea91f6a4bf58fc1"} Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.617346 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-mfsfk" event={"ID":"26434b2d-c04d-42b7-9631-6d0851886141","Type":"ContainerStarted","Data":"e38a5164e895027828f4fdbe2f78dc325a7c34231473a0cf07df4f5620de009f"} Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.620685 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-bbhfb" event={"ID":"99111621-16af-4be2-b4d4-ce9b82e41165","Type":"ContainerStarted","Data":"4549f004c5af7ac6b439a641c4111ae807898f23dea67e81908156fb0092c715"} Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.634022 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-7bpmp"] Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.641678 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-gdxrh" event={"ID":"b97ffca4-e4a1-4fbf-8271-d97410ffa49a","Type":"ContainerStarted","Data":"c496285621f301939a113e9e9c44c21441c7623296c49266f5a662b4d184b93a"} Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.659138 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-t97b7" event={"ID":"06c975b5-ec27-4ff9-b7bb-115c12275ac2","Type":"ContainerStarted","Data":"3cafba997f4254396c71fda069f2bf1513eedfc7019dc9dbc03291c154cb4461"} Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.703185 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-fpgkx"] Mar 20 07:30:15 crc kubenswrapper[4749]: W0320 07:30:15.711219 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcee7836b_e12f_4de9_be6b_4caa60294269.slice/crio-f7b9b6e62bba74406d55065ad59e4d5e093b4fcafc012d8aa13f23ddd91fb96b WatchSource:0}: Error finding container f7b9b6e62bba74406d55065ad59e4d5e093b4fcafc012d8aa13f23ddd91fb96b: Status 404 returned error can't find the container with id f7b9b6e62bba74406d55065ad59e4d5e093b4fcafc012d8aa13f23ddd91fb96b Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.720171 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-5jlcc"] Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.745634 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-548s6"] Mar 20 07:30:15 crc kubenswrapper[4749]: W0320 07:30:15.839573 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa996ed9_64cd_4371_80e7_8122c77285fc.slice/crio-ad09336a5e35ba49d4a33f2f21705488d75be34938b426a1f0dd40a0a46a406a WatchSource:0}: Error finding container 
ad09336a5e35ba49d4a33f2f21705488d75be34938b426a1f0dd40a0a46a406a: Status 404 returned error can't find the container with id ad09336a5e35ba49d4a33f2f21705488d75be34938b426a1f0dd40a0a46a406a Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.841397 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-zpwsq"] Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.868751 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/461831bb-9c93-49f8-a32e-ec01c4bdc549-metrics-certs\") pod \"openstack-operator-controller-manager-65c7c8696f-s7w78\" (UID: \"461831bb-9c93-49f8-a32e-ec01c4bdc549\") " pod="openstack-operators/openstack-operator-controller-manager-65c7c8696f-s7w78" Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.868910 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/461831bb-9c93-49f8-a32e-ec01c4bdc549-webhook-certs\") pod \"openstack-operator-controller-manager-65c7c8696f-s7w78\" (UID: \"461831bb-9c93-49f8-a32e-ec01c4bdc549\") " pod="openstack-operators/openstack-operator-controller-manager-65c7c8696f-s7w78" Mar 20 07:30:15 crc kubenswrapper[4749]: E0320 07:30:15.868953 4749 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 07:30:15 crc kubenswrapper[4749]: E0320 07:30:15.869029 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/461831bb-9c93-49f8-a32e-ec01c4bdc549-metrics-certs podName:461831bb-9c93-49f8-a32e-ec01c4bdc549 nodeName:}" failed. No retries permitted until 2026-03-20 07:30:16.869010489 +0000 UTC m=+1053.418668136 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/461831bb-9c93-49f8-a32e-ec01c4bdc549-metrics-certs") pod "openstack-operator-controller-manager-65c7c8696f-s7w78" (UID: "461831bb-9c93-49f8-a32e-ec01c4bdc549") : secret "metrics-server-cert" not found Mar 20 07:30:15 crc kubenswrapper[4749]: E0320 07:30:15.869065 4749 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 07:30:15 crc kubenswrapper[4749]: E0320 07:30:15.869130 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/461831bb-9c93-49f8-a32e-ec01c4bdc549-webhook-certs podName:461831bb-9c93-49f8-a32e-ec01c4bdc549 nodeName:}" failed. No retries permitted until 2026-03-20 07:30:16.869113601 +0000 UTC m=+1053.418771248 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/461831bb-9c93-49f8-a32e-ec01c4bdc549-webhook-certs") pod "openstack-operator-controller-manager-65c7c8696f-s7w78" (UID: "461831bb-9c93-49f8-a32e-ec01c4bdc549") : secret "webhook-server-cert" not found Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.882304 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-mscpf"] Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.888602 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-6kn5g"] Mar 20 07:30:15 crc kubenswrapper[4749]: W0320 07:30:15.892002 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b2dc985_5b75_4bc6_8c79_392034f38960.slice/crio-c331e60672eed4d00aa32fd22528da39bea4db6d556ad982c5e6f140172a1539 WatchSource:0}: Error finding container c331e60672eed4d00aa32fd22528da39bea4db6d556ad982c5e6f140172a1539: Status 404 returned error can't find the container with id c331e60672eed4d00aa32fd22528da39bea4db6d556ad982c5e6f140172a1539 Mar 20 07:30:15 crc kubenswrapper[4749]: W0320 07:30:15.895138 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0e5f3af_b138_43f6_b007_ca56ec51851c.slice/crio-93979f07492b5c79fdfe888e2126c4d02c49adc3a63c8f8faf09a7914236949c WatchSource:0}: Error finding container 93979f07492b5c79fdfe888e2126c4d02c49adc3a63c8f8faf09a7914236949c: Status 404 returned error can't find the container with id 93979f07492b5c79fdfe888e2126c4d02c49adc3a63c8f8faf09a7914236949c Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.970155 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5af1049c-beed-4d2a-93da-95171c0142e3-cert\") pod \"infra-operator-controller-manager-5d9899ccc6-2x44r\" (UID: \"5af1049c-beed-4d2a-93da-95171c0142e3\") " pod="openstack-operators/infra-operator-controller-manager-5d9899ccc6-2x44r" Mar 20 07:30:15 crc kubenswrapper[4749]: E0320 07:30:15.970497 4749 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 07:30:15 crc kubenswrapper[4749]: E0320 07:30:15.970663 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5af1049c-beed-4d2a-93da-95171c0142e3-cert podName:5af1049c-beed-4d2a-93da-95171c0142e3 nodeName:}" failed. No retries permitted until 2026-03-20 07:30:17.97063655 +0000 UTC m=+1054.520294227 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5af1049c-beed-4d2a-93da-95171c0142e3-cert") pod "infra-operator-controller-manager-5d9899ccc6-2x44r" (UID: "5af1049c-beed-4d2a-93da-95171c0142e3") : secret "infra-operator-webhook-server-cert" not found Mar 20 07:30:15 crc kubenswrapper[4749]: I0320 07:30:15.994142 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-dhsjc"] Mar 20 07:30:16 crc kubenswrapper[4749]: W0320 07:30:16.025691 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5230399_dbb2_4a03_afcb_58dd2c1fdd22.slice/crio-b5342260bcac3f4c5eea7c6766996a0ce75409167227dd3a7d5ffd6c904dcd2d WatchSource:0}: Error finding container b5342260bcac3f4c5eea7c6766996a0ce75409167227dd3a7d5ffd6c904dcd2d: Status 404 returned error can't find the container with id b5342260bcac3f4c5eea7c6766996a0ce75409167227dd3a7d5ffd6c904dcd2d Mar 20 07:30:16 crc kubenswrapper[4749]: W0320 07:30:16.066631 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda15a8919_b4d7_418a_b725_38e7d7b0e859.slice/crio-705d0637619201f48b6f9ae88d2d8fc34f9032213e0253dfe4ccd5b3c1dda378 WatchSource:0}: Error finding container 705d0637619201f48b6f9ae88d2d8fc34f9032213e0253dfe4ccd5b3c1dda378: Status 404 returned error can't find the container with id 705d0637619201f48b6f9ae88d2d8fc34f9032213e0253dfe4ccd5b3c1dda378 Mar 20 07:30:16 crc kubenswrapper[4749]: I0320 07:30:16.067172 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-289b6"] Mar 20 07:30:16 crc kubenswrapper[4749]: E0320 07:30:16.069083 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7m6vw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-c674c5965-289b6_openstack-operators(a15a8919-b4d7-418a-b725-38e7d7b0e859): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 07:30:16 crc kubenswrapper[4749]: E0320 07:30:16.070668 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-289b6" podUID="a15a8919-b4d7-418a-b725-38e7d7b0e859" Mar 20 07:30:16 crc kubenswrapper[4749]: I0320 07:30:16.097560 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-nqz8s"] Mar 20 07:30:16 crc kubenswrapper[4749]: I0320 07:30:16.109162 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-86v48"] Mar 20 07:30:16 crc kubenswrapper[4749]: I0320 07:30:16.121001 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-kmzst"] Mar 20 07:30:16 crc kubenswrapper[4749]: I0320 07:30:16.126467 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-p867r"] Mar 20 07:30:16 crc kubenswrapper[4749]: E0320 07:30:16.132524 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lrjsh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-884679f54-kmzst_openstack-operators(b4c290c3-309d-4706-935a-0e33bf4e403b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 07:30:16 crc kubenswrapper[4749]: W0320 07:30:16.132572 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75ab6716_99ca_4fd9_a632_0bc69d5c3742.slice/crio-fb58f7c5d59d1560a80758633f30ef43d824ac5ba807c8c11a5074d7f83782cc WatchSource:0}: Error finding container fb58f7c5d59d1560a80758633f30ef43d824ac5ba807c8c11a5074d7f83782cc: Status 404 returned error can't find the container with id fb58f7c5d59d1560a80758633f30ef43d824ac5ba807c8c11a5074d7f83782cc Mar 20 07:30:16 crc kubenswrapper[4749]: E0320 07:30:16.132640 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6fmvn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-86v48_openstack-operators(589f626e-af46-4f5e-98f6-d4ad787f84d8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 07:30:16 crc kubenswrapper[4749]: E0320 07:30:16.133635 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-kmzst" podUID="b4c290c3-309d-4706-935a-0e33bf4e403b" Mar 20 07:30:16 crc kubenswrapper[4749]: E0320 07:30:16.133699 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-86v48" podUID="589f626e-af46-4f5e-98f6-d4ad787f84d8" Mar 20 07:30:16 crc kubenswrapper[4749]: I0320 07:30:16.141000 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-dq6v4"] Mar 20 07:30:16 crc kubenswrapper[4749]: I0320 07:30:16.163889 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-6pbbk"] Mar 20 07:30:16 crc kubenswrapper[4749]: E0320 07:30:16.193558 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ttmr5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-d6b694c5-dq6v4_openstack-operators(7161b86e-8178-40db-a6a3-71f724746aed): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 07:30:16 crc kubenswrapper[4749]: E0320 07:30:16.194164 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pqv5w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6c4d75f7f9-6pbbk_openstack-operators(ddbd9da2-a48e-4e49-894e-4a9ae1109a73): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 07:30:16 crc kubenswrapper[4749]: E0320 07:30:16.194420 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fb46c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-p867r_openstack-operators(75ab6716-99ca-4fd9-a632-0bc69d5c3742): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 07:30:16 crc 
kubenswrapper[4749]: E0320 07:30:16.196113 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-p867r" podUID="75ab6716-99ca-4fd9-a632-0bc69d5c3742"
Mar 20 07:30:16 crc kubenswrapper[4749]: E0320 07:30:16.197548 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-dq6v4" podUID="7161b86e-8178-40db-a6a3-71f724746aed"
Mar 20 07:30:16 crc kubenswrapper[4749]: E0320 07:30:16.197607 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-6pbbk" podUID="ddbd9da2-a48e-4e49-894e-4a9ae1109a73"
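The ErrImagePull: "pull QPS exceeded" failures above are not registry-side errors: the kubelet rate-limits image pulls through a token bucket (the KubeletConfiguration fields registryPullQPS and registryBurst, whose usual defaults are 5 and 10), and this burst of operator pods starting at once exhausts the bucket. A sketch of that behavior using golang.org/x/time/rate, with those defaults assumed:

package main

import (
	"fmt"

	"golang.org/x/time/rate"
)

func main() {
	// Token bucket with the assumed kubelet defaults:
	// registryPullQPS=5 (refill rate), registryBurst=10 (bucket size).
	limiter := rate.NewLimiter(rate.Limit(5), 10)
	for pull := 1; pull <= 15; pull++ {
		// Allow() consumes a token if one is available; once the burst
		// is spent, near-simultaneous pulls are rejected, which the
		// kubelet surfaces as "pull QPS exceeded".
		fmt.Printf("image pull %2d allowed=%v\n", pull, limiter.Allow())
	}
}

Pods that lose this race fall into ImagePullBackOff, as the entries below show, and retry once tokens refill and their backoff expires.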
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8ba67eb0-3c0d-4558-b603-3626f3980dad-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-4qwrj" (UID: "8ba67eb0-3c0d-4558-b603-3626f3980dad") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 07:30:16 crc kubenswrapper[4749]: I0320 07:30:16.676023 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-dhsjc" event={"ID":"f5230399-dbb2-4a03-afcb-58dd2c1fdd22","Type":"ContainerStarted","Data":"b5342260bcac3f4c5eea7c6766996a0ce75409167227dd3a7d5ffd6c904dcd2d"} Mar 20 07:30:16 crc kubenswrapper[4749]: I0320 07:30:16.677719 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-p867r" event={"ID":"75ab6716-99ca-4fd9-a632-0bc69d5c3742","Type":"ContainerStarted","Data":"fb58f7c5d59d1560a80758633f30ef43d824ac5ba807c8c11a5074d7f83782cc"} Mar 20 07:30:16 crc kubenswrapper[4749]: I0320 07:30:16.678759 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-nqz8s" event={"ID":"9d6d1c42-480e-49ac-8a40-233fb95e4a0a","Type":"ContainerStarted","Data":"a751b59c946640a51cd695c225870f49f56b0b5d89002176f8625e875b559ed8"} Mar 20 07:30:16 crc kubenswrapper[4749]: E0320 07:30:16.680607 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-p867r" podUID="75ab6716-99ca-4fd9-a632-0bc69d5c3742" Mar 20 07:30:16 crc kubenswrapper[4749]: I0320 07:30:16.681415 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-kmzst" event={"ID":"b4c290c3-309d-4706-935a-0e33bf4e403b","Type":"ContainerStarted","Data":"b4b4fe45da127585c88231aaa5e84f59a74975cd01534303195724dabf0715c1"} Mar 20 07:30:16 crc kubenswrapper[4749]: E0320 07:30:16.682382 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-kmzst" podUID="b4c290c3-309d-4706-935a-0e33bf4e403b" Mar 20 07:30:16 crc kubenswrapper[4749]: I0320 07:30:16.683633 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-6pbbk" event={"ID":"ddbd9da2-a48e-4e49-894e-4a9ae1109a73","Type":"ContainerStarted","Data":"660f479b030acd87d4ad5460b913d5fc72cc9707644fe574dd60c1b470dd61fb"} Mar 20 07:30:16 crc kubenswrapper[4749]: E0320 07:30:16.684721 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-6pbbk" podUID="ddbd9da2-a48e-4e49-894e-4a9ae1109a73" Mar 20 07:30:16 crc kubenswrapper[4749]: I0320 07:30:16.685065 4749 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-548s6" event={"ID":"cd82533b-5f9e-45e3-a645-90e678bcbf4a","Type":"ContainerStarted","Data":"89ccfcf9c21f0e7edd0758006e583100e2d3984dd826f2221f6e463bf74267cb"} Mar 20 07:30:16 crc kubenswrapper[4749]: I0320 07:30:16.688418 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-86v48" event={"ID":"589f626e-af46-4f5e-98f6-d4ad787f84d8","Type":"ContainerStarted","Data":"2b641c35bea06b7f1740eda49d4458728b80f9c5a933bc2f3c55c7eed2c2c778"} Mar 20 07:30:16 crc kubenswrapper[4749]: I0320 07:30:16.691783 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-zpwsq" event={"ID":"fa996ed9-64cd-4371-80e7-8122c77285fc","Type":"ContainerStarted","Data":"ad09336a5e35ba49d4a33f2f21705488d75be34938b426a1f0dd40a0a46a406a"} Mar 20 07:30:16 crc kubenswrapper[4749]: E0320 07:30:16.694219 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-86v48" podUID="589f626e-af46-4f5e-98f6-d4ad787f84d8" Mar 20 07:30:16 crc kubenswrapper[4749]: I0320 07:30:16.696517 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-mscpf" event={"ID":"6b2dc985-5b75-4bc6-8c79-392034f38960","Type":"ContainerStarted","Data":"c331e60672eed4d00aa32fd22528da39bea4db6d556ad982c5e6f140172a1539"} Mar 20 07:30:16 crc kubenswrapper[4749]: I0320 07:30:16.698833 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-7bpmp" event={"ID":"a56dfc81-4a0f-4e99-a884-cff054d164b9","Type":"ContainerStarted","Data":"7038f59dc4f87f74c907e62828b1265e5a8b4e5781760d71ad4d44e33c2b29ad"} Mar 20 07:30:16 crc kubenswrapper[4749]: I0320 07:30:16.701038 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-fpgkx" event={"ID":"a0cd89a4-110c-4df5-b9ce-186f38d9be30","Type":"ContainerStarted","Data":"f9857a3f0bd081aa5a1e9870747ff65c99b969a720c0bf96f448031916a92682"} Mar 20 07:30:16 crc kubenswrapper[4749]: I0320 07:30:16.701980 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-dq6v4" event={"ID":"7161b86e-8178-40db-a6a3-71f724746aed","Type":"ContainerStarted","Data":"97e9b6382f9949e225c3fba6ada098a253f4e2164945600c48974794100daa36"} Mar 20 07:30:16 crc kubenswrapper[4749]: E0320 07:30:16.705801 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-dq6v4" podUID="7161b86e-8178-40db-a6a3-71f724746aed" Mar 20 07:30:16 crc kubenswrapper[4749]: I0320 07:30:16.705843 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-289b6" 
event={"ID":"a15a8919-b4d7-418a-b725-38e7d7b0e859","Type":"ContainerStarted","Data":"705d0637619201f48b6f9ae88d2d8fc34f9032213e0253dfe4ccd5b3c1dda378"} Mar 20 07:30:16 crc kubenswrapper[4749]: E0320 07:30:16.707232 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-289b6" podUID="a15a8919-b4d7-418a-b725-38e7d7b0e859" Mar 20 07:30:16 crc kubenswrapper[4749]: I0320 07:30:16.707430 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-6kn5g" event={"ID":"a0e5f3af-b138-43f6-b007-ca56ec51851c","Type":"ContainerStarted","Data":"93979f07492b5c79fdfe888e2126c4d02c49adc3a63c8f8faf09a7914236949c"} Mar 20 07:30:16 crc kubenswrapper[4749]: I0320 07:30:16.718470 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-5jlcc" event={"ID":"cee7836b-e12f-4de9-be6b-4caa60294269","Type":"ContainerStarted","Data":"f7b9b6e62bba74406d55065ad59e4d5e093b4fcafc012d8aa13f23ddd91fb96b"} Mar 20 07:30:16 crc kubenswrapper[4749]: I0320 07:30:16.904400 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/461831bb-9c93-49f8-a32e-ec01c4bdc549-webhook-certs\") pod \"openstack-operator-controller-manager-65c7c8696f-s7w78\" (UID: \"461831bb-9c93-49f8-a32e-ec01c4bdc549\") " pod="openstack-operators/openstack-operator-controller-manager-65c7c8696f-s7w78" Mar 20 07:30:16 crc kubenswrapper[4749]: I0320 07:30:16.904507 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/461831bb-9c93-49f8-a32e-ec01c4bdc549-metrics-certs\") pod \"openstack-operator-controller-manager-65c7c8696f-s7w78\" (UID: \"461831bb-9c93-49f8-a32e-ec01c4bdc549\") " pod="openstack-operators/openstack-operator-controller-manager-65c7c8696f-s7w78" Mar 20 07:30:16 crc kubenswrapper[4749]: E0320 07:30:16.904696 4749 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 07:30:16 crc kubenswrapper[4749]: E0320 07:30:16.904760 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/461831bb-9c93-49f8-a32e-ec01c4bdc549-metrics-certs podName:461831bb-9c93-49f8-a32e-ec01c4bdc549 nodeName:}" failed. No retries permitted until 2026-03-20 07:30:18.904740127 +0000 UTC m=+1055.454397774 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/461831bb-9c93-49f8-a32e-ec01c4bdc549-metrics-certs") pod "openstack-operator-controller-manager-65c7c8696f-s7w78" (UID: "461831bb-9c93-49f8-a32e-ec01c4bdc549") : secret "metrics-server-cert" not found Mar 20 07:30:16 crc kubenswrapper[4749]: E0320 07:30:16.904774 4749 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 07:30:16 crc kubenswrapper[4749]: E0320 07:30:16.904868 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/461831bb-9c93-49f8-a32e-ec01c4bdc549-webhook-certs podName:461831bb-9c93-49f8-a32e-ec01c4bdc549 nodeName:}" failed. 
No retries permitted until 2026-03-20 07:30:18.904826499 +0000 UTC m=+1055.454484146 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/461831bb-9c93-49f8-a32e-ec01c4bdc549-webhook-certs") pod "openstack-operator-controller-manager-65c7c8696f-s7w78" (UID: "461831bb-9c93-49f8-a32e-ec01c4bdc549") : secret "webhook-server-cert" not found
Mar 20 07:30:17 crc kubenswrapper[4749]: E0320 07:30:17.741839 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-289b6" podUID="a15a8919-b4d7-418a-b725-38e7d7b0e859"
Mar 20 07:30:17 crc kubenswrapper[4749]: E0320 07:30:17.741846 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-6pbbk" podUID="ddbd9da2-a48e-4e49-894e-4a9ae1109a73"
Mar 20 07:30:17 crc kubenswrapper[4749]: E0320 07:30:17.742093 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-kmzst" podUID="b4c290c3-309d-4706-935a-0e33bf4e403b"
Mar 20 07:30:17 crc kubenswrapper[4749]: E0320 07:30:17.742173 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-86v48" podUID="589f626e-af46-4f5e-98f6-d4ad787f84d8"
Mar 20 07:30:17 crc kubenswrapper[4749]: E0320 07:30:17.742230 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-dq6v4" podUID="7161b86e-8178-40db-a6a3-71f724746aed"
Mar 20 07:30:17 crc kubenswrapper[4749]: E0320 07:30:17.744615 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-p867r" podUID="75ab6716-99ca-4fd9-a632-0bc69d5c3742"
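After a pull fails, the pod workers above log ImagePullBackOff rather than attempting a new pull on every sync: each image carries its own backoff window that roughly doubles per failure. The 10s starting period and 5m ceiling below are the commonly cited kubelet constants, assumed here for illustration rather than read from this log:

package main

import (
	"fmt"
	"time"
)

func main() {
	// Assumed kubelet image-pull backoff: start around 10s, double per
	// failed pull, cap at 5m. While the window is open, syncs log
	// "Back-off pulling image" without contacting the registry.
	backoff := 10 * time.Second
	maxBackoff := 5 * time.Minute
	for failure := 1; failure <= 7; failure++ {
		fmt.Printf("failure %d: next pull no sooner than %v\n", failure, backoff)
		backoff *= 2
		if backoff > maxBackoff {
			backoff = maxBackoff
		}
	}
}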
\"5af1049c-beed-4d2a-93da-95171c0142e3\") " pod="openstack-operators/infra-operator-controller-manager-5d9899ccc6-2x44r" Mar 20 07:30:18 crc kubenswrapper[4749]: E0320 07:30:18.036368 4749 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 07:30:18 crc kubenswrapper[4749]: E0320 07:30:18.036571 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5af1049c-beed-4d2a-93da-95171c0142e3-cert podName:5af1049c-beed-4d2a-93da-95171c0142e3 nodeName:}" failed. No retries permitted until 2026-03-20 07:30:22.036554893 +0000 UTC m=+1058.586212540 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5af1049c-beed-4d2a-93da-95171c0142e3-cert") pod "infra-operator-controller-manager-5d9899ccc6-2x44r" (UID: "5af1049c-beed-4d2a-93da-95171c0142e3") : secret "infra-operator-webhook-server-cert" not found Mar 20 07:30:18 crc kubenswrapper[4749]: I0320 07:30:18.440756 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ba67eb0-3c0d-4558-b603-3626f3980dad-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-4qwrj\" (UID: \"8ba67eb0-3c0d-4558-b603-3626f3980dad\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-4qwrj" Mar 20 07:30:18 crc kubenswrapper[4749]: E0320 07:30:18.440976 4749 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 07:30:18 crc kubenswrapper[4749]: E0320 07:30:18.441085 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ba67eb0-3c0d-4558-b603-3626f3980dad-cert podName:8ba67eb0-3c0d-4558-b603-3626f3980dad nodeName:}" failed. No retries permitted until 2026-03-20 07:30:22.44106212 +0000 UTC m=+1058.990719767 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8ba67eb0-3c0d-4558-b603-3626f3980dad-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-4qwrj" (UID: "8ba67eb0-3c0d-4558-b603-3626f3980dad") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 07:30:18 crc kubenswrapper[4749]: I0320 07:30:18.953366 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/461831bb-9c93-49f8-a32e-ec01c4bdc549-webhook-certs\") pod \"openstack-operator-controller-manager-65c7c8696f-s7w78\" (UID: \"461831bb-9c93-49f8-a32e-ec01c4bdc549\") " pod="openstack-operators/openstack-operator-controller-manager-65c7c8696f-s7w78" Mar 20 07:30:18 crc kubenswrapper[4749]: I0320 07:30:18.953436 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/461831bb-9c93-49f8-a32e-ec01c4bdc549-metrics-certs\") pod \"openstack-operator-controller-manager-65c7c8696f-s7w78\" (UID: \"461831bb-9c93-49f8-a32e-ec01c4bdc549\") " pod="openstack-operators/openstack-operator-controller-manager-65c7c8696f-s7w78" Mar 20 07:30:18 crc kubenswrapper[4749]: E0320 07:30:18.953617 4749 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 07:30:18 crc kubenswrapper[4749]: E0320 07:30:18.953668 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/461831bb-9c93-49f8-a32e-ec01c4bdc549-metrics-certs podName:461831bb-9c93-49f8-a32e-ec01c4bdc549 nodeName:}" failed. No retries permitted until 2026-03-20 07:30:22.953654536 +0000 UTC m=+1059.503312183 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/461831bb-9c93-49f8-a32e-ec01c4bdc549-metrics-certs") pod "openstack-operator-controller-manager-65c7c8696f-s7w78" (UID: "461831bb-9c93-49f8-a32e-ec01c4bdc549") : secret "metrics-server-cert" not found Mar 20 07:30:18 crc kubenswrapper[4749]: E0320 07:30:18.953667 4749 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 07:30:18 crc kubenswrapper[4749]: E0320 07:30:18.953810 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/461831bb-9c93-49f8-a32e-ec01c4bdc549-webhook-certs podName:461831bb-9c93-49f8-a32e-ec01c4bdc549 nodeName:}" failed. No retries permitted until 2026-03-20 07:30:22.953776639 +0000 UTC m=+1059.503434286 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/461831bb-9c93-49f8-a32e-ec01c4bdc549-webhook-certs") pod "openstack-operator-controller-manager-65c7c8696f-s7w78" (UID: "461831bb-9c93-49f8-a32e-ec01c4bdc549") : secret "webhook-server-cert" not found Mar 20 07:30:22 crc kubenswrapper[4749]: I0320 07:30:22.097072 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5af1049c-beed-4d2a-93da-95171c0142e3-cert\") pod \"infra-operator-controller-manager-5d9899ccc6-2x44r\" (UID: \"5af1049c-beed-4d2a-93da-95171c0142e3\") " pod="openstack-operators/infra-operator-controller-manager-5d9899ccc6-2x44r" Mar 20 07:30:22 crc kubenswrapper[4749]: E0320 07:30:22.097483 4749 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 07:30:22 crc kubenswrapper[4749]: E0320 07:30:22.097699 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5af1049c-beed-4d2a-93da-95171c0142e3-cert podName:5af1049c-beed-4d2a-93da-95171c0142e3 nodeName:}" failed. No retries permitted until 2026-03-20 07:30:30.097683858 +0000 UTC m=+1066.647341505 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5af1049c-beed-4d2a-93da-95171c0142e3-cert") pod "infra-operator-controller-manager-5d9899ccc6-2x44r" (UID: "5af1049c-beed-4d2a-93da-95171c0142e3") : secret "infra-operator-webhook-server-cert" not found Mar 20 07:30:22 crc kubenswrapper[4749]: I0320 07:30:22.504734 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ba67eb0-3c0d-4558-b603-3626f3980dad-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-4qwrj\" (UID: \"8ba67eb0-3c0d-4558-b603-3626f3980dad\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-4qwrj" Mar 20 07:30:22 crc kubenswrapper[4749]: E0320 07:30:22.504907 4749 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 07:30:22 crc kubenswrapper[4749]: E0320 07:30:22.504971 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ba67eb0-3c0d-4558-b603-3626f3980dad-cert podName:8ba67eb0-3c0d-4558-b603-3626f3980dad nodeName:}" failed. No retries permitted until 2026-03-20 07:30:30.504955693 +0000 UTC m=+1067.054613340 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8ba67eb0-3c0d-4558-b603-3626f3980dad-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-4qwrj" (UID: "8ba67eb0-3c0d-4558-b603-3626f3980dad") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 07:30:23 crc kubenswrapper[4749]: I0320 07:30:23.011139 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/461831bb-9c93-49f8-a32e-ec01c4bdc549-webhook-certs\") pod \"openstack-operator-controller-manager-65c7c8696f-s7w78\" (UID: \"461831bb-9c93-49f8-a32e-ec01c4bdc549\") " pod="openstack-operators/openstack-operator-controller-manager-65c7c8696f-s7w78" Mar 20 07:30:23 crc kubenswrapper[4749]: I0320 07:30:23.011213 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/461831bb-9c93-49f8-a32e-ec01c4bdc549-metrics-certs\") pod \"openstack-operator-controller-manager-65c7c8696f-s7w78\" (UID: \"461831bb-9c93-49f8-a32e-ec01c4bdc549\") " pod="openstack-operators/openstack-operator-controller-manager-65c7c8696f-s7w78" Mar 20 07:30:23 crc kubenswrapper[4749]: E0320 07:30:23.011509 4749 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 07:30:23 crc kubenswrapper[4749]: E0320 07:30:23.011557 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/461831bb-9c93-49f8-a32e-ec01c4bdc549-metrics-certs podName:461831bb-9c93-49f8-a32e-ec01c4bdc549 nodeName:}" failed. No retries permitted until 2026-03-20 07:30:31.011543153 +0000 UTC m=+1067.561200800 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/461831bb-9c93-49f8-a32e-ec01c4bdc549-metrics-certs") pod "openstack-operator-controller-manager-65c7c8696f-s7w78" (UID: "461831bb-9c93-49f8-a32e-ec01c4bdc549") : secret "metrics-server-cert" not found Mar 20 07:30:23 crc kubenswrapper[4749]: E0320 07:30:23.013526 4749 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 07:30:23 crc kubenswrapper[4749]: E0320 07:30:23.013680 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/461831bb-9c93-49f8-a32e-ec01c4bdc549-webhook-certs podName:461831bb-9c93-49f8-a32e-ec01c4bdc549 nodeName:}" failed. No retries permitted until 2026-03-20 07:30:31.013656274 +0000 UTC m=+1067.563314021 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/461831bb-9c93-49f8-a32e-ec01c4bdc549-webhook-certs") pod "openstack-operator-controller-manager-65c7c8696f-s7w78" (UID: "461831bb-9c93-49f8-a32e-ec01c4bdc549") : secret "webhook-server-cert" not found Mar 20 07:30:28 crc kubenswrapper[4749]: E0320 07:30:28.609638 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:76a1cde9f29fb39ed715b06be16adb803b9a2e24d68acb369911c0a88e33bc7d" Mar 20 07:30:28 crc kubenswrapper[4749]: E0320 07:30:28.610146 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:76a1cde9f29fb39ed715b06be16adb803b9a2e24d68acb369911c0a88e33bc7d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kx75r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-79df6bcc97-t97b7_openstack-operators(06c975b5-ec27-4ff9-b7bb-115c12275ac2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 07:30:28 crc kubenswrapper[4749]: E0320 07:30:28.611322 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-t97b7" 
podUID="06c975b5-ec27-4ff9-b7bb-115c12275ac2" Mar 20 07:30:28 crc kubenswrapper[4749]: E0320 07:30:28.832091 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:76a1cde9f29fb39ed715b06be16adb803b9a2e24d68acb369911c0a88e33bc7d\\\"\"" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-t97b7" podUID="06c975b5-ec27-4ff9-b7bb-115c12275ac2" Mar 20 07:30:29 crc kubenswrapper[4749]: E0320 07:30:29.384331 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a" Mar 20 07:30:29 crc kubenswrapper[4749]: E0320 07:30:29.384858 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p6wh2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5d488d59fb-6kn5g_openstack-operators(a0e5f3af-b138-43f6-b007-ca56ec51851c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 07:30:29 crc kubenswrapper[4749]: E0320 07:30:29.386053 4749 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-6kn5g" podUID="a0e5f3af-b138-43f6-b007-ca56ec51851c" Mar 20 07:30:29 crc kubenswrapper[4749]: E0320 07:30:29.836701 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-6kn5g" podUID="a0e5f3af-b138-43f6-b007-ca56ec51851c" Mar 20 07:30:29 crc kubenswrapper[4749]: E0320 07:30:29.913572 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56" Mar 20 07:30:29 crc kubenswrapper[4749]: E0320 07:30:29.913777 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4j688,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-768b96df4c-fpgkx_openstack-operators(a0cd89a4-110c-4df5-b9ce-186f38d9be30): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 07:30:29 crc kubenswrapper[4749]: E0320 07:30:29.914956 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-fpgkx" podUID="a0cd89a4-110c-4df5-b9ce-186f38d9be30" Mar 20 07:30:30 crc kubenswrapper[4749]: I0320 07:30:30.121121 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5af1049c-beed-4d2a-93da-95171c0142e3-cert\") pod \"infra-operator-controller-manager-5d9899ccc6-2x44r\" (UID: \"5af1049c-beed-4d2a-93da-95171c0142e3\") " pod="openstack-operators/infra-operator-controller-manager-5d9899ccc6-2x44r" Mar 20 07:30:30 crc kubenswrapper[4749]: I0320 07:30:30.132118 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5af1049c-beed-4d2a-93da-95171c0142e3-cert\") pod \"infra-operator-controller-manager-5d9899ccc6-2x44r\" (UID: \"5af1049c-beed-4d2a-93da-95171c0142e3\") " pod="openstack-operators/infra-operator-controller-manager-5d9899ccc6-2x44r" Mar 20 07:30:30 crc kubenswrapper[4749]: I0320 07:30:30.239150 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5d9899ccc6-2x44r" Mar 20 07:30:30 crc kubenswrapper[4749]: I0320 07:30:30.537701 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ba67eb0-3c0d-4558-b603-3626f3980dad-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-4qwrj\" (UID: \"8ba67eb0-3c0d-4558-b603-3626f3980dad\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-4qwrj" Mar 20 07:30:30 crc kubenswrapper[4749]: E0320 07:30:30.537947 4749 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 07:30:30 crc kubenswrapper[4749]: E0320 07:30:30.538086 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ba67eb0-3c0d-4558-b603-3626f3980dad-cert podName:8ba67eb0-3c0d-4558-b603-3626f3980dad nodeName:}" failed. No retries permitted until 2026-03-20 07:30:46.538069836 +0000 UTC m=+1083.087727483 (durationBeforeRetry 16s). 
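The three "Unhandled Error" entries above each dump the full container spec that failed to start. Rendered with the k8s.io/api types, the fields the kubelet actually printed look roughly like the literal below -- a readable transcription of the glance dump with most nil/empty fields omitted, not code taken from the operator itself:

    package main

    import (
    	corev1 "k8s.io/api/core/v1"
    	"k8s.io/apimachinery/pkg/api/resource"
    	"k8s.io/apimachinery/pkg/util/intstr"
    )

    // manager mirrors the &Container{...} dump in the log entry above.
    var manager = corev1.Container{
    	Name:    "manager",
    	Image:   "quay.io/openstack-k8s-operators/glance-operator@sha256:76a1cde9f29fb39ed715b06be16adb803b9a2e24d68acb369911c0a88e33bc7d",
    	Command: []string{"/manager"},
    	Args:    []string{"--leader-elect", "--health-probe-bind-address=:8081", "--metrics-bind-address=127.0.0.1:8080"},
    	Env: []corev1.EnvVar{
    		{Name: "LEASE_DURATION", Value: "30"},
    		{Name: "RENEW_DEADLINE", Value: "20"},
    		{Name: "RETRY_PERIOD", Value: "5"},
    		{Name: "ENABLE_WEBHOOKS", Value: "false"},
    		{Name: "METRICS_CERTS", Value: "false"},
    	},
    	Resources: corev1.ResourceRequirements{
    		// 536870912 bytes = 512Mi and 268435456 bytes = 256Mi in the raw dump.
    		Limits: corev1.ResourceList{
    			corev1.ResourceCPU:    resource.MustParse("500m"),
    			corev1.ResourceMemory: resource.MustParse("512Mi"),
    		},
    		Requests: corev1.ResourceList{
    			corev1.ResourceCPU:    resource.MustParse("10m"),
    			corev1.ResourceMemory: resource.MustParse("256Mi"),
    		},
    	},
    	LivenessProbe: &corev1.Probe{
    		ProbeHandler:        corev1.ProbeHandler{HTTPGet: &corev1.HTTPGetAction{Path: "/healthz", Port: intstr.FromInt(8081)}},
    		InitialDelaySeconds: 15,
    		PeriodSeconds:       20,
    	},
    	ReadinessProbe: &corev1.Probe{
    		ProbeHandler:        corev1.ProbeHandler{HTTPGet: &corev1.HTTPGetAction{Path: "/readyz", Port: intstr.FromInt(8081)}},
    		InitialDelaySeconds: 5,
    		PeriodSeconds:       10,
    	},
    	ImagePullPolicy: corev1.PullIfNotPresent,
    }

The nova and keystone dumps differ only in image digest, service-account mount name, and pod identity; in all three cases the failure is the pull itself ("copying config: context canceled"), not anything in the spec.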
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8ba67eb0-3c0d-4558-b603-3626f3980dad-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-4qwrj" (UID: "8ba67eb0-3c0d-4558-b603-3626f3980dad") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 07:30:30 crc kubenswrapper[4749]: I0320 07:30:30.640189 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5d9899ccc6-2x44r"] Mar 20 07:30:30 crc kubenswrapper[4749]: I0320 07:30:30.848760 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-mscpf" event={"ID":"6b2dc985-5b75-4bc6-8c79-392034f38960","Type":"ContainerStarted","Data":"19691c0e15a6f53951adb839614841a55dc4de9a70a03ae51c5e69a0ba09d81d"} Mar 20 07:30:30 crc kubenswrapper[4749]: I0320 07:30:30.849150 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-mscpf" Mar 20 07:30:30 crc kubenswrapper[4749]: I0320 07:30:30.850049 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-7bpmp" event={"ID":"a56dfc81-4a0f-4e99-a884-cff054d164b9","Type":"ContainerStarted","Data":"6c46c5eeb8674bf879557f6803c5fc6c88ad036a4833ccc2aeb2b8256feaef41"} Mar 20 07:30:30 crc kubenswrapper[4749]: I0320 07:30:30.850240 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-7bpmp" Mar 20 07:30:30 crc kubenswrapper[4749]: I0320 07:30:30.857780 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-mfsfk" event={"ID":"26434b2d-c04d-42b7-9631-6d0851886141","Type":"ContainerStarted","Data":"14a1c272f1460a2aa95ecc28e3ed5750ec05e524e8881cd6d8015ca20e444ce0"} Mar 20 07:30:30 crc kubenswrapper[4749]: I0320 07:30:30.858406 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-mfsfk" Mar 20 07:30:30 crc kubenswrapper[4749]: I0320 07:30:30.862836 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-bbhfb" event={"ID":"99111621-16af-4be2-b4d4-ce9b82e41165","Type":"ContainerStarted","Data":"22592aeaf3134ac3367098fee076012e2a3647f105322753def000aeddd8b807"} Mar 20 07:30:30 crc kubenswrapper[4749]: I0320 07:30:30.862968 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-bbhfb" Mar 20 07:30:30 crc kubenswrapper[4749]: I0320 07:30:30.868129 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-zpwsq" event={"ID":"fa996ed9-64cd-4371-80e7-8122c77285fc","Type":"ContainerStarted","Data":"9d4f596755aa80c4eb292f4dbd0c4d942fe2691fcce1a4d370ebc3befc971ab9"} Mar 20 07:30:30 crc kubenswrapper[4749]: I0320 07:30:30.868268 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-55f864c847-zpwsq" Mar 20 07:30:30 crc kubenswrapper[4749]: I0320 07:30:30.872991 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-dhsjc" 
event={"ID":"f5230399-dbb2-4a03-afcb-58dd2c1fdd22","Type":"ContainerStarted","Data":"afd9d92370ec4bf3aa8395464f2f24898ce5112d974f14bfcde3a6e47eda12e3"} Mar 20 07:30:30 crc kubenswrapper[4749]: I0320 07:30:30.873220 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-dhsjc" Mar 20 07:30:30 crc kubenswrapper[4749]: I0320 07:30:30.881944 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-nqz8s" event={"ID":"9d6d1c42-480e-49ac-8a40-233fb95e4a0a","Type":"ContainerStarted","Data":"71e9f5d3f4f1baf373df8b26177eb3f0c83fb46e17e5aa7e3d5497492d002508"} Mar 20 07:30:30 crc kubenswrapper[4749]: I0320 07:30:30.883021 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-9t9xf" event={"ID":"88958bd4-4087-4f7c-b72e-9c2cea412993","Type":"ContainerStarted","Data":"d9963bb231a7ac1579a6e648e509466f9c1ba238ccdc203a0009eccdb7aca4ba"} Mar 20 07:30:30 crc kubenswrapper[4749]: I0320 07:30:30.883232 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-9t9xf" Mar 20 07:30:30 crc kubenswrapper[4749]: I0320 07:30:30.890490 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-gdxrh" event={"ID":"b97ffca4-e4a1-4fbf-8271-d97410ffa49a","Type":"ContainerStarted","Data":"7af41ff43d27146952a6bb7cdc81ff18c8b1bce6cde129a6072057307502e713"} Mar 20 07:30:30 crc kubenswrapper[4749]: I0320 07:30:30.890820 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-gdxrh" Mar 20 07:30:30 crc kubenswrapper[4749]: I0320 07:30:30.904142 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-548s6" event={"ID":"cd82533b-5f9e-45e3-a645-90e678bcbf4a","Type":"ContainerStarted","Data":"6725406f13593b04c6531479044b605d304db647d0b9f729058560eece8e7733"} Mar 20 07:30:30 crc kubenswrapper[4749]: I0320 07:30:30.904260 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-767865f676-548s6" Mar 20 07:30:30 crc kubenswrapper[4749]: I0320 07:30:30.915898 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-5jlcc" event={"ID":"cee7836b-e12f-4de9-be6b-4caa60294269","Type":"ContainerStarted","Data":"e80aa1e3e3f53a5f3e4689de2dfc7107139d6533ef32559273189fd53131b4e0"} Mar 20 07:30:30 crc kubenswrapper[4749]: I0320 07:30:30.916231 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-5jlcc" Mar 20 07:30:30 crc kubenswrapper[4749]: I0320 07:30:30.921333 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5d9899ccc6-2x44r" event={"ID":"5af1049c-beed-4d2a-93da-95171c0142e3","Type":"ContainerStarted","Data":"2ce1ae597a564c9ca0338c070e6f9cfd7a944df8f057980722a93d60a97ac182"} Mar 20 07:30:30 crc kubenswrapper[4749]: E0320 07:30:30.924069 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-fpgkx" podUID="a0cd89a4-110c-4df5-b9ce-186f38d9be30" Mar 20 07:30:30 crc kubenswrapper[4749]: I0320 07:30:30.944575 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-mscpf" podStartSLOduration=2.9083792859999997 podStartE2EDuration="16.944559432s" podCreationTimestamp="2026-03-20 07:30:14 +0000 UTC" firstStartedPulling="2026-03-20 07:30:15.894332174 +0000 UTC m=+1052.443989821" lastFinishedPulling="2026-03-20 07:30:29.93051232 +0000 UTC m=+1066.480169967" observedRunningTime="2026-03-20 07:30:30.94201306 +0000 UTC m=+1067.491670707" watchObservedRunningTime="2026-03-20 07:30:30.944559432 +0000 UTC m=+1067.494217079" Mar 20 07:30:31 crc kubenswrapper[4749]: I0320 07:30:31.000802 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5784578c99-nqz8s" podStartSLOduration=3.1660865129999998 podStartE2EDuration="17.00078857s" podCreationTimestamp="2026-03-20 07:30:14 +0000 UTC" firstStartedPulling="2026-03-20 07:30:16.108371839 +0000 UTC m=+1052.658029486" lastFinishedPulling="2026-03-20 07:30:29.943073856 +0000 UTC m=+1066.492731543" observedRunningTime="2026-03-20 07:30:31.000515783 +0000 UTC m=+1067.550173440" watchObservedRunningTime="2026-03-20 07:30:31.00078857 +0000 UTC m=+1067.550446217" Mar 20 07:30:31 crc kubenswrapper[4749]: I0320 07:30:31.051675 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/461831bb-9c93-49f8-a32e-ec01c4bdc549-webhook-certs\") pod \"openstack-operator-controller-manager-65c7c8696f-s7w78\" (UID: \"461831bb-9c93-49f8-a32e-ec01c4bdc549\") " pod="openstack-operators/openstack-operator-controller-manager-65c7c8696f-s7w78" Mar 20 07:30:31 crc kubenswrapper[4749]: E0320 07:30:31.051741 4749 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 07:30:31 crc kubenswrapper[4749]: I0320 07:30:31.051996 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/461831bb-9c93-49f8-a32e-ec01c4bdc549-metrics-certs\") pod \"openstack-operator-controller-manager-65c7c8696f-s7w78\" (UID: \"461831bb-9c93-49f8-a32e-ec01c4bdc549\") " pod="openstack-operators/openstack-operator-controller-manager-65c7c8696f-s7w78" Mar 20 07:30:31 crc kubenswrapper[4749]: E0320 07:30:31.052123 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/461831bb-9c93-49f8-a32e-ec01c4bdc549-webhook-certs podName:461831bb-9c93-49f8-a32e-ec01c4bdc549 nodeName:}" failed. No retries permitted until 2026-03-20 07:30:47.052009675 +0000 UTC m=+1083.601667322 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/461831bb-9c93-49f8-a32e-ec01c4bdc549-webhook-certs") pod "openstack-operator-controller-manager-65c7c8696f-s7w78" (UID: "461831bb-9c93-49f8-a32e-ec01c4bdc549") : secret "webhook-server-cert" not found Mar 20 07:30:31 crc kubenswrapper[4749]: E0320 07:30:31.052130 4749 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 07:30:31 crc kubenswrapper[4749]: E0320 07:30:31.052268 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/461831bb-9c93-49f8-a32e-ec01c4bdc549-metrics-certs podName:461831bb-9c93-49f8-a32e-ec01c4bdc549 nodeName:}" failed. No retries permitted until 2026-03-20 07:30:47.052260292 +0000 UTC m=+1083.601917939 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/461831bb-9c93-49f8-a32e-ec01c4bdc549-metrics-certs") pod "openstack-operator-controller-manager-65c7c8696f-s7w78" (UID: "461831bb-9c93-49f8-a32e-ec01c4bdc549") : secret "metrics-server-cert" not found Mar 20 07:30:31 crc kubenswrapper[4749]: I0320 07:30:31.062328 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-dhsjc" podStartSLOduration=3.100515578 podStartE2EDuration="17.062313636s" podCreationTimestamp="2026-03-20 07:30:14 +0000 UTC" firstStartedPulling="2026-03-20 07:30:16.03192214 +0000 UTC m=+1052.581579787" lastFinishedPulling="2026-03-20 07:30:29.993720198 +0000 UTC m=+1066.543377845" observedRunningTime="2026-03-20 07:30:31.061436244 +0000 UTC m=+1067.611093891" watchObservedRunningTime="2026-03-20 07:30:31.062313636 +0000 UTC m=+1067.611971283" Mar 20 07:30:31 crc kubenswrapper[4749]: I0320 07:30:31.107321 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-7bpmp" podStartSLOduration=2.827473279 podStartE2EDuration="17.10730553s" podCreationTimestamp="2026-03-20 07:30:14 +0000 UTC" firstStartedPulling="2026-03-20 07:30:15.659034662 +0000 UTC m=+1052.208692309" lastFinishedPulling="2026-03-20 07:30:29.938866913 +0000 UTC m=+1066.488524560" observedRunningTime="2026-03-20 07:30:31.105071415 +0000 UTC m=+1067.654729052" watchObservedRunningTime="2026-03-20 07:30:31.10730553 +0000 UTC m=+1067.656963177" Mar 20 07:30:31 crc kubenswrapper[4749]: I0320 07:30:31.169990 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-gdxrh" podStartSLOduration=2.725493558 podStartE2EDuration="17.169971954s" podCreationTimestamp="2026-03-20 07:30:14 +0000 UTC" firstStartedPulling="2026-03-20 07:30:15.485907011 +0000 UTC m=+1052.035564648" lastFinishedPulling="2026-03-20 07:30:29.930385397 +0000 UTC m=+1066.480043044" observedRunningTime="2026-03-20 07:30:31.133506808 +0000 UTC m=+1067.683164455" watchObservedRunningTime="2026-03-20 07:30:31.169971954 +0000 UTC m=+1067.719629601" Mar 20 07:30:31 crc kubenswrapper[4749]: I0320 07:30:31.205206 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-9t9xf" podStartSLOduration=2.700646504 podStartE2EDuration="17.205188361s" podCreationTimestamp="2026-03-20 07:30:14 +0000 UTC" firstStartedPulling="2026-03-20 07:30:15.485679496 +0000 UTC m=+1052.035337143" 
lastFinishedPulling="2026-03-20 07:30:29.990221353 +0000 UTC m=+1066.539879000" observedRunningTime="2026-03-20 07:30:31.199621435 +0000 UTC m=+1067.749279082" watchObservedRunningTime="2026-03-20 07:30:31.205188361 +0000 UTC m=+1067.754846008" Mar 20 07:30:31 crc kubenswrapper[4749]: I0320 07:30:31.206987 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-bbhfb" podStartSLOduration=2.144421026 podStartE2EDuration="17.206981474s" podCreationTimestamp="2026-03-20 07:30:14 +0000 UTC" firstStartedPulling="2026-03-20 07:30:14.829047357 +0000 UTC m=+1051.378704994" lastFinishedPulling="2026-03-20 07:30:29.891607795 +0000 UTC m=+1066.441265442" observedRunningTime="2026-03-20 07:30:31.175821976 +0000 UTC m=+1067.725479623" watchObservedRunningTime="2026-03-20 07:30:31.206981474 +0000 UTC m=+1067.756639121" Mar 20 07:30:31 crc kubenswrapper[4749]: I0320 07:30:31.250498 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-55f864c847-zpwsq" podStartSLOduration=3.153012564 podStartE2EDuration="17.250480802s" podCreationTimestamp="2026-03-20 07:30:14 +0000 UTC" firstStartedPulling="2026-03-20 07:30:15.845738832 +0000 UTC m=+1052.395396479" lastFinishedPulling="2026-03-20 07:30:29.94320708 +0000 UTC m=+1066.492864717" observedRunningTime="2026-03-20 07:30:31.249670503 +0000 UTC m=+1067.799328150" watchObservedRunningTime="2026-03-20 07:30:31.250480802 +0000 UTC m=+1067.800138439" Mar 20 07:30:31 crc kubenswrapper[4749]: I0320 07:30:31.297163 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-767865f676-548s6" podStartSLOduration=3.148876165 podStartE2EDuration="17.297131786s" podCreationTimestamp="2026-03-20 07:30:14 +0000 UTC" firstStartedPulling="2026-03-20 07:30:15.794873286 +0000 UTC m=+1052.344530933" lastFinishedPulling="2026-03-20 07:30:29.943128907 +0000 UTC m=+1066.492786554" observedRunningTime="2026-03-20 07:30:31.294598635 +0000 UTC m=+1067.844256282" watchObservedRunningTime="2026-03-20 07:30:31.297131786 +0000 UTC m=+1067.846789433" Mar 20 07:30:31 crc kubenswrapper[4749]: I0320 07:30:31.344173 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-mfsfk" podStartSLOduration=2.849263148 podStartE2EDuration="17.34415698s" podCreationTimestamp="2026-03-20 07:30:14 +0000 UTC" firstStartedPulling="2026-03-20 07:30:15.496671423 +0000 UTC m=+1052.046329070" lastFinishedPulling="2026-03-20 07:30:29.991565255 +0000 UTC m=+1066.541222902" observedRunningTime="2026-03-20 07:30:31.339453446 +0000 UTC m=+1067.889111093" watchObservedRunningTime="2026-03-20 07:30:31.34415698 +0000 UTC m=+1067.893814627" Mar 20 07:30:31 crc kubenswrapper[4749]: I0320 07:30:31.375345 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-5jlcc" podStartSLOduration=3.145670546 podStartE2EDuration="17.375321238s" podCreationTimestamp="2026-03-20 07:30:14 +0000 UTC" firstStartedPulling="2026-03-20 07:30:15.713491615 +0000 UTC m=+1052.263149262" lastFinishedPulling="2026-03-20 07:30:29.943142307 +0000 UTC m=+1066.492799954" observedRunningTime="2026-03-20 07:30:31.371666979 +0000 UTC m=+1067.921324636" watchObservedRunningTime="2026-03-20 07:30:31.375321238 +0000 UTC m=+1067.924978895" Mar 20 
Mar 20 07:30:31 crc kubenswrapper[4749]: I0320 07:30:31.941043 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5784578c99-nqz8s"
Mar 20 07:30:35 crc kubenswrapper[4749]: I0320 07:30:35.046714 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-dhsjc"
Mar 20 07:30:35 crc kubenswrapper[4749]: I0320 07:30:35.112428 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5784578c99-nqz8s"
Mar 20 07:30:39 crc kubenswrapper[4749]: I0320 07:30:39.008153 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-86v48" event={"ID":"589f626e-af46-4f5e-98f6-d4ad787f84d8","Type":"ContainerStarted","Data":"be47bc2c296f1200e6f4b7d8b693d9c0aa707881f8e2d4efbb29b8d327b5537d"}
Mar 20 07:30:39 crc kubenswrapper[4749]: I0320 07:30:39.011630 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-kmzst" event={"ID":"b4c290c3-309d-4706-935a-0e33bf4e403b","Type":"ContainerStarted","Data":"e8cf6adb7b4790fafaed5113cce9cdceeadad3b8ef20dffdfef70f60e468bc8a"}
Mar 20 07:30:39 crc kubenswrapper[4749]: I0320 07:30:39.011891 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-884679f54-kmzst"
Mar 20 07:30:39 crc kubenswrapper[4749]: I0320 07:30:39.012917 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-6pbbk" event={"ID":"ddbd9da2-a48e-4e49-894e-4a9ae1109a73","Type":"ContainerStarted","Data":"e19a0e9385694e8fdfffc118a17f8a269b70ab277b285fe1770f4134c5bbd9e2"}
Mar 20 07:30:39 crc kubenswrapper[4749]: I0320 07:30:39.013045 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-6pbbk"
Mar 20 07:30:39 crc kubenswrapper[4749]: I0320 07:30:39.014254 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-289b6" event={"ID":"a15a8919-b4d7-418a-b725-38e7d7b0e859","Type":"ContainerStarted","Data":"9ae0dd03e5d177e017c745544cfc41c362260f169f13e6ec2b9299ba00ca492a"}
Mar 20 07:30:39 crc kubenswrapper[4749]: I0320 07:30:39.014445 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-c674c5965-289b6"
Mar 20 07:30:39 crc kubenswrapper[4749]: I0320 07:30:39.015671 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5d9899ccc6-2x44r" event={"ID":"5af1049c-beed-4d2a-93da-95171c0142e3","Type":"ContainerStarted","Data":"dab5466001dcc0f958210a9465ee5be2ae7ae38c8c4f6111cecce40c07d1c959"}
Mar 20 07:30:39 crc kubenswrapper[4749]: I0320 07:30:39.015732 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-5d9899ccc6-2x44r"
Mar 20 07:30:39 crc kubenswrapper[4749]: I0320 07:30:39.017314 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-p867r" event={"ID":"75ab6716-99ca-4fd9-a632-0bc69d5c3742","Type":"ContainerStarted","Data":"4c8273502c91223ac3175242e1ae21578307ce61511ea50bfbb37b378a5731f3"}
Mar 20 07:30:39 crc kubenswrapper[4749]: I0320 07:30:39.017525 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-p867r"
Mar 20 07:30:39 crc kubenswrapper[4749]: I0320 07:30:39.018552 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-dq6v4" event={"ID":"7161b86e-8178-40db-a6a3-71f724746aed","Type":"ContainerStarted","Data":"5fdaa692e5ddb5aeedf3ccbdcd7c892913327ec4cd152d4ebcdf790f4308f862"}
Mar 20 07:30:39 crc kubenswrapper[4749]: I0320 07:30:39.018816 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-dq6v4"
Mar 20 07:30:39 crc kubenswrapper[4749]: I0320 07:30:39.036654 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-86v48" podStartSLOduration=2.626423238 podStartE2EDuration="25.036629959s" podCreationTimestamp="2026-03-20 07:30:14 +0000 UTC" firstStartedPulling="2026-03-20 07:30:16.132579328 +0000 UTC m=+1052.682236975" lastFinishedPulling="2026-03-20 07:30:38.542786039 +0000 UTC m=+1075.092443696" observedRunningTime="2026-03-20 07:30:39.029981107 +0000 UTC m=+1075.579638764" watchObservedRunningTime="2026-03-20 07:30:39.036629959 +0000 UTC m=+1075.586287616"
Mar 20 07:30:39 crc kubenswrapper[4749]: I0320 07:30:39.054567 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-884679f54-kmzst" podStartSLOduration=2.710975645 podStartE2EDuration="25.054550735s" podCreationTimestamp="2026-03-20 07:30:14 +0000 UTC" firstStartedPulling="2026-03-20 07:30:16.132383514 +0000 UTC m=+1052.682041161" lastFinishedPulling="2026-03-20 07:30:38.475958594 +0000 UTC m=+1075.025616251" observedRunningTime="2026-03-20 07:30:39.050294352 +0000 UTC m=+1075.599951999" watchObservedRunningTime="2026-03-20 07:30:39.054550735 +0000 UTC m=+1075.604208372"
Mar 20 07:30:39 crc kubenswrapper[4749]: I0320 07:30:39.082615 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-6pbbk" podStartSLOduration=2.800714557 podStartE2EDuration="25.082596377s" podCreationTimestamp="2026-03-20 07:30:14 +0000 UTC" firstStartedPulling="2026-03-20 07:30:16.194005122 +0000 UTC m=+1052.743662769" lastFinishedPulling="2026-03-20 07:30:38.475886922 +0000 UTC m=+1075.025544589" observedRunningTime="2026-03-20 07:30:39.068955825 +0000 UTC m=+1075.618613472" watchObservedRunningTime="2026-03-20 07:30:39.082596377 +0000 UTC m=+1075.632254024"
Mar 20 07:30:39 crc kubenswrapper[4749]: I0320 07:30:39.085587 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-5d9899ccc6-2x44r" podStartSLOduration=17.29358719 podStartE2EDuration="25.08557966s" podCreationTimestamp="2026-03-20 07:30:14 +0000 UTC" firstStartedPulling="2026-03-20 07:30:30.683844971 +0000 UTC m=+1067.233502618" lastFinishedPulling="2026-03-20 07:30:38.475837441 +0000 UTC m=+1075.025495088" observedRunningTime="2026-03-20 07:30:39.080867425 +0000 UTC m=+1075.630525072" watchObservedRunningTime="2026-03-20 07:30:39.08557966 +0000 UTC m=+1075.635237307"
Mar 20 07:30:39 crc kubenswrapper[4749]: I0320 07:30:39.099471 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-c674c5965-289b6" podStartSLOduration=2.692443233 podStartE2EDuration="25.099458357s" podCreationTimestamp="2026-03-20 07:30:14 +0000 UTC" firstStartedPulling="2026-03-20 07:30:16.068964441 +0000 UTC m=+1052.618622088" lastFinishedPulling="2026-03-20 07:30:38.475979565 +0000 UTC m=+1075.025637212" observedRunningTime="2026-03-20 07:30:39.096110706 +0000 UTC m=+1075.645768353" watchObservedRunningTime="2026-03-20 07:30:39.099458357 +0000 UTC m=+1075.649116004"
Mar 20 07:30:39 crc kubenswrapper[4749]: I0320 07:30:39.123586 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-p867r" podStartSLOduration=2.825412556 podStartE2EDuration="25.123567313s" podCreationTimestamp="2026-03-20 07:30:14 +0000 UTC" firstStartedPulling="2026-03-20 07:30:16.19312061 +0000 UTC m=+1052.742778257" lastFinishedPulling="2026-03-20 07:30:38.491275357 +0000 UTC m=+1075.040933014" observedRunningTime="2026-03-20 07:30:39.11846402 +0000 UTC m=+1075.668121677" watchObservedRunningTime="2026-03-20 07:30:39.123567313 +0000 UTC m=+1075.673224970"
Mar 20 07:30:39 crc kubenswrapper[4749]: I0320 07:30:39.136299 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-dq6v4" podStartSLOduration=2.85021024 podStartE2EDuration="25.136269892s" podCreationTimestamp="2026-03-20 07:30:14 +0000 UTC" firstStartedPulling="2026-03-20 07:30:16.193390617 +0000 UTC m=+1052.743048274" lastFinishedPulling="2026-03-20 07:30:38.479450279 +0000 UTC m=+1075.029107926" observedRunningTime="2026-03-20 07:30:39.133574897 +0000 UTC m=+1075.683232544" watchObservedRunningTime="2026-03-20 07:30:39.136269892 +0000 UTC m=+1075.685927539"
Mar 20 07:30:44 crc kubenswrapper[4749]: I0320 07:30:44.432627 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-bbhfb"
Mar 20 07:30:44 crc kubenswrapper[4749]: I0320 07:30:44.458735 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-gdxrh"
Mar 20 07:30:44 crc kubenswrapper[4749]: I0320 07:30:44.463348 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-mfsfk"
Mar 20 07:30:44 crc kubenswrapper[4749]: I0320 07:30:44.555419 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-9t9xf"
Mar 20 07:30:44 crc kubenswrapper[4749]: I0320 07:30:44.627470 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-7bpmp"
Mar 20 07:30:44 crc kubenswrapper[4749]: I0320 07:30:44.669023 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-5jlcc"
Mar 20 07:30:44 crc kubenswrapper[4749]: I0320 07:30:44.843480 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-55f864c847-zpwsq"
Mar 20 07:30:44 crc kubenswrapper[4749]: I0320 07:30:44.870808 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-mscpf"
Mar 20 07:30:44 crc kubenswrapper[4749]: I0320 07:30:44.882427 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-767865f676-548s6"
Mar 20 07:30:45 crc kubenswrapper[4749]: I0320 07:30:45.084562 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-884679f54-kmzst"
Mar 20 07:30:45 crc kubenswrapper[4749]: I0320 07:30:45.191140 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-c674c5965-289b6"
Mar 20 07:30:45 crc kubenswrapper[4749]: I0320 07:30:45.222174 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-dq6v4"
Mar 20 07:30:45 crc kubenswrapper[4749]: I0320 07:30:45.256461 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-p867r"
Mar 20 07:30:45 crc kubenswrapper[4749]: I0320 07:30:45.306695 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-6pbbk"
Mar 20 07:30:46 crc kubenswrapper[4749]: I0320 07:30:46.540040 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ba67eb0-3c0d-4558-b603-3626f3980dad-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-4qwrj\" (UID: \"8ba67eb0-3c0d-4558-b603-3626f3980dad\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-4qwrj"
Mar 20 07:30:46 crc kubenswrapper[4749]: I0320 07:30:46.561182 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ba67eb0-3c0d-4558-b603-3626f3980dad-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-4qwrj\" (UID: \"8ba67eb0-3c0d-4558-b603-3626f3980dad\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-4qwrj"
Mar 20 07:30:46 crc kubenswrapper[4749]: I0320 07:30:46.829402 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-2pnpk"
Mar 20 07:30:46 crc kubenswrapper[4749]: I0320 07:30:46.838022 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-4qwrj"
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-4qwrj" Mar 20 07:30:47 crc kubenswrapper[4749]: I0320 07:30:47.147551 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/461831bb-9c93-49f8-a32e-ec01c4bdc549-metrics-certs\") pod \"openstack-operator-controller-manager-65c7c8696f-s7w78\" (UID: \"461831bb-9c93-49f8-a32e-ec01c4bdc549\") " pod="openstack-operators/openstack-operator-controller-manager-65c7c8696f-s7w78" Mar 20 07:30:47 crc kubenswrapper[4749]: I0320 07:30:47.147925 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/461831bb-9c93-49f8-a32e-ec01c4bdc549-webhook-certs\") pod \"openstack-operator-controller-manager-65c7c8696f-s7w78\" (UID: \"461831bb-9c93-49f8-a32e-ec01c4bdc549\") " pod="openstack-operators/openstack-operator-controller-manager-65c7c8696f-s7w78" Mar 20 07:30:47 crc kubenswrapper[4749]: I0320 07:30:47.153848 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/461831bb-9c93-49f8-a32e-ec01c4bdc549-webhook-certs\") pod \"openstack-operator-controller-manager-65c7c8696f-s7w78\" (UID: \"461831bb-9c93-49f8-a32e-ec01c4bdc549\") " pod="openstack-operators/openstack-operator-controller-manager-65c7c8696f-s7w78" Mar 20 07:30:47 crc kubenswrapper[4749]: I0320 07:30:47.160592 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/461831bb-9c93-49f8-a32e-ec01c4bdc549-metrics-certs\") pod \"openstack-operator-controller-manager-65c7c8696f-s7w78\" (UID: \"461831bb-9c93-49f8-a32e-ec01c4bdc549\") " pod="openstack-operators/openstack-operator-controller-manager-65c7c8696f-s7w78" Mar 20 07:30:47 crc kubenswrapper[4749]: I0320 07:30:47.227167 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-wb5rq" Mar 20 07:30:47 crc kubenswrapper[4749]: I0320 07:30:47.235337 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-65c7c8696f-s7w78" Mar 20 07:30:47 crc kubenswrapper[4749]: I0320 07:30:47.581224 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-4qwrj"] Mar 20 07:30:47 crc kubenswrapper[4749]: I0320 07:30:47.662853 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-65c7c8696f-s7w78"] Mar 20 07:30:47 crc kubenswrapper[4749]: W0320 07:30:47.664710 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod461831bb_9c93_49f8_a32e_ec01c4bdc549.slice/crio-0dfd076a1acb2ecabbe9f7ca2eaa3d8393eec6da9a24adc9c0508e75864f5b84 WatchSource:0}: Error finding container 0dfd076a1acb2ecabbe9f7ca2eaa3d8393eec6da9a24adc9c0508e75864f5b84: Status 404 returned error can't find the container with id 0dfd076a1acb2ecabbe9f7ca2eaa3d8393eec6da9a24adc9c0508e75864f5b84 Mar 20 07:30:48 crc kubenswrapper[4749]: I0320 07:30:48.098665 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-4qwrj" event={"ID":"8ba67eb0-3c0d-4558-b603-3626f3980dad","Type":"ContainerStarted","Data":"52c7cb7aaef06c69e005cd8de7c00c1bd5ed5a1828894936ed099f2eeb07018d"} Mar 20 07:30:48 crc kubenswrapper[4749]: I0320 07:30:48.101610 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-65c7c8696f-s7w78" event={"ID":"461831bb-9c93-49f8-a32e-ec01c4bdc549","Type":"ContainerStarted","Data":"f88ba80ac675f0924d3cf27d77e676121bf883a2c331b0627f695dac5c6ce50a"} Mar 20 07:30:48 crc kubenswrapper[4749]: I0320 07:30:48.101669 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-65c7c8696f-s7w78" event={"ID":"461831bb-9c93-49f8-a32e-ec01c4bdc549","Type":"ContainerStarted","Data":"0dfd076a1acb2ecabbe9f7ca2eaa3d8393eec6da9a24adc9c0508e75864f5b84"} Mar 20 07:30:48 crc kubenswrapper[4749]: I0320 07:30:48.103445 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-t97b7" event={"ID":"06c975b5-ec27-4ff9-b7bb-115c12275ac2","Type":"ContainerStarted","Data":"e3d9107c7f9dfc9b7410a71908f75e079755bdaaca323872ff8a82155d5f641e"} Mar 20 07:30:48 crc kubenswrapper[4749]: I0320 07:30:48.103617 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-t97b7" Mar 20 07:30:48 crc kubenswrapper[4749]: I0320 07:30:48.126227 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-t97b7" podStartSLOduration=2.570786985 podStartE2EDuration="34.126207875s" podCreationTimestamp="2026-03-20 07:30:14 +0000 UTC" firstStartedPulling="2026-03-20 07:30:15.485402029 +0000 UTC m=+1052.035059676" lastFinishedPulling="2026-03-20 07:30:47.040822879 +0000 UTC m=+1083.590480566" observedRunningTime="2026-03-20 07:30:48.121370048 +0000 UTC m=+1084.671027715" watchObservedRunningTime="2026-03-20 07:30:48.126207875 +0000 UTC m=+1084.675865532" Mar 20 07:30:49 crc kubenswrapper[4749]: I0320 07:30:49.114769 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-65c7c8696f-s7w78" Mar 20 
07:30:49 crc kubenswrapper[4749]: I0320 07:30:49.155117 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-65c7c8696f-s7w78" podStartSLOduration=35.155090808 podStartE2EDuration="35.155090808s" podCreationTimestamp="2026-03-20 07:30:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:30:49.154935264 +0000 UTC m=+1085.704592991" watchObservedRunningTime="2026-03-20 07:30:49.155090808 +0000 UTC m=+1085.704748475" Mar 20 07:30:50 crc kubenswrapper[4749]: I0320 07:30:50.129119 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-6kn5g" event={"ID":"a0e5f3af-b138-43f6-b007-ca56ec51851c","Type":"ContainerStarted","Data":"28f489da2cc808bcab5e8029ce6068a6b4aaa1fce734330793bfd530c153d92a"} Mar 20 07:30:50 crc kubenswrapper[4749]: I0320 07:30:50.129648 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-6kn5g" Mar 20 07:30:50 crc kubenswrapper[4749]: I0320 07:30:50.132319 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-fpgkx" event={"ID":"a0cd89a4-110c-4df5-b9ce-186f38d9be30","Type":"ContainerStarted","Data":"73b956ea7be791034e86ec48a3ead836c7dc8d4369a7451ca55b9e6c717027d8"} Mar 20 07:30:50 crc kubenswrapper[4749]: I0320 07:30:50.132486 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-fpgkx" Mar 20 07:30:50 crc kubenswrapper[4749]: I0320 07:30:50.149197 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-6kn5g" podStartSLOduration=2.491913778 podStartE2EDuration="36.149181614s" podCreationTimestamp="2026-03-20 07:30:14 +0000 UTC" firstStartedPulling="2026-03-20 07:30:15.901211482 +0000 UTC m=+1052.450869129" lastFinishedPulling="2026-03-20 07:30:49.558479318 +0000 UTC m=+1086.108136965" observedRunningTime="2026-03-20 07:30:50.146752634 +0000 UTC m=+1086.696410271" watchObservedRunningTime="2026-03-20 07:30:50.149181614 +0000 UTC m=+1086.698839261" Mar 20 07:30:50 crc kubenswrapper[4749]: I0320 07:30:50.164128 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-fpgkx" podStartSLOduration=2.366051697 podStartE2EDuration="36.164107517s" podCreationTimestamp="2026-03-20 07:30:14 +0000 UTC" firstStartedPulling="2026-03-20 07:30:15.760181612 +0000 UTC m=+1052.309839259" lastFinishedPulling="2026-03-20 07:30:49.558237432 +0000 UTC m=+1086.107895079" observedRunningTime="2026-03-20 07:30:50.160829327 +0000 UTC m=+1086.710486994" watchObservedRunningTime="2026-03-20 07:30:50.164107517 +0000 UTC m=+1086.713765174" Mar 20 07:30:50 crc kubenswrapper[4749]: I0320 07:30:50.246141 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-5d9899ccc6-2x44r" Mar 20 07:30:51 crc kubenswrapper[4749]: I0320 07:30:51.140508 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-4qwrj" 
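One detail worth noting in the openstack-operator startup entry above: firstStartedPulling and lastFinishedPulling are both "0001-01-01 00:00:00 +0000 UTC", Go's zero time.Time, meaning no image pull was recorded for this pod, so nothing is subtracted and podStartSLOduration equals podStartE2EDuration exactly (35.155090808s). That is consistent with the image already being present on the node, as one would expect if its pull policy is IfNotPresent like the other manager containers dumped earlier in this log, though the entry itself does not state the reason. The zero value prints exactly as logged:

    package main

    import (
    	"fmt"
    	"time"
    )

    func main() {
    	var t time.Time           // never set: no pull was recorded
    	fmt.Println(t)            // 0001-01-01 00:00:00 +0000 UTC
    	fmt.Println(t.IsZero())   // true
    }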
event={"ID":"8ba67eb0-3c0d-4558-b603-3626f3980dad","Type":"ContainerStarted","Data":"c9feffadbcf2df18431855b81167ca0b92cd05a1dd75424b0783cb24ac7d56de"} Mar 20 07:30:51 crc kubenswrapper[4749]: I0320 07:30:51.178133 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-4qwrj" podStartSLOduration=34.000755805 podStartE2EDuration="37.178109087s" podCreationTimestamp="2026-03-20 07:30:14 +0000 UTC" firstStartedPulling="2026-03-20 07:30:47.592771002 +0000 UTC m=+1084.142428649" lastFinishedPulling="2026-03-20 07:30:50.770124284 +0000 UTC m=+1087.319781931" observedRunningTime="2026-03-20 07:30:51.174155171 +0000 UTC m=+1087.723812818" watchObservedRunningTime="2026-03-20 07:30:51.178109087 +0000 UTC m=+1087.727766744" Mar 20 07:30:52 crc kubenswrapper[4749]: I0320 07:30:52.148748 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-4qwrj" Mar 20 07:30:54 crc kubenswrapper[4749]: I0320 07:30:54.213490 4749 scope.go:117] "RemoveContainer" containerID="e41808d9265ea41ff96884562bc18f2b7f0078f95e505a187b00de0ecda610df" Mar 20 07:30:54 crc kubenswrapper[4749]: I0320 07:30:54.516948 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-t97b7" Mar 20 07:30:54 crc kubenswrapper[4749]: I0320 07:30:54.728372 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-fpgkx" Mar 20 07:30:54 crc kubenswrapper[4749]: I0320 07:30:54.930630 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-6kn5g" Mar 20 07:30:56 crc kubenswrapper[4749]: I0320 07:30:56.845259 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-4qwrj" Mar 20 07:30:57 crc kubenswrapper[4749]: I0320 07:30:57.242202 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-65c7c8696f-s7w78" Mar 20 07:31:16 crc kubenswrapper[4749]: I0320 07:31:16.568870 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-cv4rz"] Mar 20 07:31:16 crc kubenswrapper[4749]: I0320 07:31:16.571108 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-cv4rz" Mar 20 07:31:16 crc kubenswrapper[4749]: I0320 07:31:16.573522 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 20 07:31:16 crc kubenswrapper[4749]: I0320 07:31:16.573560 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 20 07:31:16 crc kubenswrapper[4749]: I0320 07:31:16.573615 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-pfdll" Mar 20 07:31:16 crc kubenswrapper[4749]: I0320 07:31:16.573982 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 20 07:31:16 crc kubenswrapper[4749]: I0320 07:31:16.575873 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-cv4rz"] Mar 20 07:31:16 crc kubenswrapper[4749]: I0320 07:31:16.587718 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87d010cf-d0a5-4c30-b14b-6f81e05e6ec4-config\") pod \"dnsmasq-dns-675f4bcbfc-cv4rz\" (UID: \"87d010cf-d0a5-4c30-b14b-6f81e05e6ec4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-cv4rz" Mar 20 07:31:16 crc kubenswrapper[4749]: I0320 07:31:16.588050 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5qml\" (UniqueName: \"kubernetes.io/projected/87d010cf-d0a5-4c30-b14b-6f81e05e6ec4-kube-api-access-c5qml\") pod \"dnsmasq-dns-675f4bcbfc-cv4rz\" (UID: \"87d010cf-d0a5-4c30-b14b-6f81e05e6ec4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-cv4rz" Mar 20 07:31:16 crc kubenswrapper[4749]: I0320 07:31:16.600876 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-l84vp"] Mar 20 07:31:16 crc kubenswrapper[4749]: I0320 07:31:16.602358 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-l84vp" Mar 20 07:31:16 crc kubenswrapper[4749]: I0320 07:31:16.605541 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 20 07:31:16 crc kubenswrapper[4749]: I0320 07:31:16.626523 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-l84vp"] Mar 20 07:31:16 crc kubenswrapper[4749]: I0320 07:31:16.689574 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5qml\" (UniqueName: \"kubernetes.io/projected/87d010cf-d0a5-4c30-b14b-6f81e05e6ec4-kube-api-access-c5qml\") pod \"dnsmasq-dns-675f4bcbfc-cv4rz\" (UID: \"87d010cf-d0a5-4c30-b14b-6f81e05e6ec4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-cv4rz" Mar 20 07:31:16 crc kubenswrapper[4749]: I0320 07:31:16.689721 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a4fd6b7-b0e1-4f80-ae5e-4c6933f87f9b-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-l84vp\" (UID: \"1a4fd6b7-b0e1-4f80-ae5e-4c6933f87f9b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-l84vp" Mar 20 07:31:16 crc kubenswrapper[4749]: I0320 07:31:16.689772 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a4fd6b7-b0e1-4f80-ae5e-4c6933f87f9b-config\") pod \"dnsmasq-dns-78dd6ddcc-l84vp\" (UID: \"1a4fd6b7-b0e1-4f80-ae5e-4c6933f87f9b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-l84vp" Mar 20 07:31:16 crc kubenswrapper[4749]: I0320 07:31:16.689855 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcfhj\" (UniqueName: \"kubernetes.io/projected/1a4fd6b7-b0e1-4f80-ae5e-4c6933f87f9b-kube-api-access-rcfhj\") pod \"dnsmasq-dns-78dd6ddcc-l84vp\" (UID: \"1a4fd6b7-b0e1-4f80-ae5e-4c6933f87f9b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-l84vp" Mar 20 07:31:16 crc kubenswrapper[4749]: I0320 07:31:16.689916 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87d010cf-d0a5-4c30-b14b-6f81e05e6ec4-config\") pod \"dnsmasq-dns-675f4bcbfc-cv4rz\" (UID: \"87d010cf-d0a5-4c30-b14b-6f81e05e6ec4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-cv4rz" Mar 20 07:31:16 crc kubenswrapper[4749]: I0320 07:31:16.690761 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87d010cf-d0a5-4c30-b14b-6f81e05e6ec4-config\") pod \"dnsmasq-dns-675f4bcbfc-cv4rz\" (UID: \"87d010cf-d0a5-4c30-b14b-6f81e05e6ec4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-cv4rz" Mar 20 07:31:16 crc kubenswrapper[4749]: I0320 07:31:16.708274 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5qml\" (UniqueName: \"kubernetes.io/projected/87d010cf-d0a5-4c30-b14b-6f81e05e6ec4-kube-api-access-c5qml\") pod \"dnsmasq-dns-675f4bcbfc-cv4rz\" (UID: \"87d010cf-d0a5-4c30-b14b-6f81e05e6ec4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-cv4rz" Mar 20 07:31:16 crc kubenswrapper[4749]: I0320 07:31:16.790970 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a4fd6b7-b0e1-4f80-ae5e-4c6933f87f9b-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-l84vp\" (UID: \"1a4fd6b7-b0e1-4f80-ae5e-4c6933f87f9b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-l84vp" Mar 20 07:31:16 crc kubenswrapper[4749]: I0320 
07:31:16.791027 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a4fd6b7-b0e1-4f80-ae5e-4c6933f87f9b-config\") pod \"dnsmasq-dns-78dd6ddcc-l84vp\" (UID: \"1a4fd6b7-b0e1-4f80-ae5e-4c6933f87f9b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-l84vp" Mar 20 07:31:16 crc kubenswrapper[4749]: I0320 07:31:16.791073 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcfhj\" (UniqueName: \"kubernetes.io/projected/1a4fd6b7-b0e1-4f80-ae5e-4c6933f87f9b-kube-api-access-rcfhj\") pod \"dnsmasq-dns-78dd6ddcc-l84vp\" (UID: \"1a4fd6b7-b0e1-4f80-ae5e-4c6933f87f9b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-l84vp" Mar 20 07:31:16 crc kubenswrapper[4749]: I0320 07:31:16.792257 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a4fd6b7-b0e1-4f80-ae5e-4c6933f87f9b-config\") pod \"dnsmasq-dns-78dd6ddcc-l84vp\" (UID: \"1a4fd6b7-b0e1-4f80-ae5e-4c6933f87f9b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-l84vp" Mar 20 07:31:16 crc kubenswrapper[4749]: I0320 07:31:16.792278 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a4fd6b7-b0e1-4f80-ae5e-4c6933f87f9b-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-l84vp\" (UID: \"1a4fd6b7-b0e1-4f80-ae5e-4c6933f87f9b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-l84vp" Mar 20 07:31:16 crc kubenswrapper[4749]: I0320 07:31:16.813789 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcfhj\" (UniqueName: \"kubernetes.io/projected/1a4fd6b7-b0e1-4f80-ae5e-4c6933f87f9b-kube-api-access-rcfhj\") pod \"dnsmasq-dns-78dd6ddcc-l84vp\" (UID: \"1a4fd6b7-b0e1-4f80-ae5e-4c6933f87f9b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-l84vp" Mar 20 07:31:16 crc kubenswrapper[4749]: I0320 07:31:16.887722 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-cv4rz" Mar 20 07:31:16 crc kubenswrapper[4749]: I0320 07:31:16.928732 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-l84vp" Mar 20 07:31:17 crc kubenswrapper[4749]: I0320 07:31:17.372521 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-l84vp"] Mar 20 07:31:17 crc kubenswrapper[4749]: I0320 07:31:17.425041 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-cv4rz"] Mar 20 07:31:17 crc kubenswrapper[4749]: W0320 07:31:17.429880 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87d010cf_d0a5_4c30_b14b_6f81e05e6ec4.slice/crio-d0e7884e9edb6ab98934d1d33e5918fa3498ea1558a9b77ff9a58e2bd943a8f0 WatchSource:0}: Error finding container d0e7884e9edb6ab98934d1d33e5918fa3498ea1558a9b77ff9a58e2bd943a8f0: Status 404 returned error can't find the container with id d0e7884e9edb6ab98934d1d33e5918fa3498ea1558a9b77ff9a58e2bd943a8f0 Mar 20 07:31:18 crc kubenswrapper[4749]: I0320 07:31:18.368838 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-cv4rz" event={"ID":"87d010cf-d0a5-4c30-b14b-6f81e05e6ec4","Type":"ContainerStarted","Data":"d0e7884e9edb6ab98934d1d33e5918fa3498ea1558a9b77ff9a58e2bd943a8f0"} Mar 20 07:31:18 crc kubenswrapper[4749]: I0320 07:31:18.369752 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-l84vp" event={"ID":"1a4fd6b7-b0e1-4f80-ae5e-4c6933f87f9b","Type":"ContainerStarted","Data":"189abc5a481a291d468b34caf0f5c4c24fdb135a54f7660c83cbddd342c0475a"} Mar 20 07:31:19 crc kubenswrapper[4749]: I0320 07:31:19.296937 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-cv4rz"] Mar 20 07:31:19 crc kubenswrapper[4749]: I0320 07:31:19.315202 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-6mxcw"] Mar 20 07:31:19 crc kubenswrapper[4749]: I0320 07:31:19.319109 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-6mxcw" Mar 20 07:31:19 crc kubenswrapper[4749]: I0320 07:31:19.330947 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-6mxcw"] Mar 20 07:31:19 crc kubenswrapper[4749]: I0320 07:31:19.429339 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5792dc4b-6da9-4a1b-9450-62d6748d79cf-dns-svc\") pod \"dnsmasq-dns-666b6646f7-6mxcw\" (UID: \"5792dc4b-6da9-4a1b-9450-62d6748d79cf\") " pod="openstack/dnsmasq-dns-666b6646f7-6mxcw" Mar 20 07:31:19 crc kubenswrapper[4749]: I0320 07:31:19.429506 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5792dc4b-6da9-4a1b-9450-62d6748d79cf-config\") pod \"dnsmasq-dns-666b6646f7-6mxcw\" (UID: \"5792dc4b-6da9-4a1b-9450-62d6748d79cf\") " pod="openstack/dnsmasq-dns-666b6646f7-6mxcw" Mar 20 07:31:19 crc kubenswrapper[4749]: I0320 07:31:19.429688 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pbpf\" (UniqueName: \"kubernetes.io/projected/5792dc4b-6da9-4a1b-9450-62d6748d79cf-kube-api-access-4pbpf\") pod \"dnsmasq-dns-666b6646f7-6mxcw\" (UID: \"5792dc4b-6da9-4a1b-9450-62d6748d79cf\") " pod="openstack/dnsmasq-dns-666b6646f7-6mxcw" Mar 20 07:31:19 crc kubenswrapper[4749]: I0320 07:31:19.530699 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pbpf\" (UniqueName: \"kubernetes.io/projected/5792dc4b-6da9-4a1b-9450-62d6748d79cf-kube-api-access-4pbpf\") pod \"dnsmasq-dns-666b6646f7-6mxcw\" (UID: \"5792dc4b-6da9-4a1b-9450-62d6748d79cf\") " pod="openstack/dnsmasq-dns-666b6646f7-6mxcw" Mar 20 07:31:19 crc kubenswrapper[4749]: I0320 07:31:19.530779 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5792dc4b-6da9-4a1b-9450-62d6748d79cf-dns-svc\") pod \"dnsmasq-dns-666b6646f7-6mxcw\" (UID: \"5792dc4b-6da9-4a1b-9450-62d6748d79cf\") " pod="openstack/dnsmasq-dns-666b6646f7-6mxcw" Mar 20 07:31:19 crc kubenswrapper[4749]: I0320 07:31:19.530809 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5792dc4b-6da9-4a1b-9450-62d6748d79cf-config\") pod \"dnsmasq-dns-666b6646f7-6mxcw\" (UID: \"5792dc4b-6da9-4a1b-9450-62d6748d79cf\") " pod="openstack/dnsmasq-dns-666b6646f7-6mxcw" Mar 20 07:31:19 crc kubenswrapper[4749]: I0320 07:31:19.534390 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5792dc4b-6da9-4a1b-9450-62d6748d79cf-config\") pod \"dnsmasq-dns-666b6646f7-6mxcw\" (UID: \"5792dc4b-6da9-4a1b-9450-62d6748d79cf\") " pod="openstack/dnsmasq-dns-666b6646f7-6mxcw" Mar 20 07:31:19 crc kubenswrapper[4749]: I0320 07:31:19.534641 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5792dc4b-6da9-4a1b-9450-62d6748d79cf-dns-svc\") pod \"dnsmasq-dns-666b6646f7-6mxcw\" (UID: \"5792dc4b-6da9-4a1b-9450-62d6748d79cf\") " pod="openstack/dnsmasq-dns-666b6646f7-6mxcw" Mar 20 07:31:19 crc kubenswrapper[4749]: I0320 07:31:19.556963 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pbpf\" (UniqueName: 
\"kubernetes.io/projected/5792dc4b-6da9-4a1b-9450-62d6748d79cf-kube-api-access-4pbpf\") pod \"dnsmasq-dns-666b6646f7-6mxcw\" (UID: \"5792dc4b-6da9-4a1b-9450-62d6748d79cf\") " pod="openstack/dnsmasq-dns-666b6646f7-6mxcw" Mar 20 07:31:19 crc kubenswrapper[4749]: I0320 07:31:19.629467 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-l84vp"] Mar 20 07:31:19 crc kubenswrapper[4749]: I0320 07:31:19.647862 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-6mxcw" Mar 20 07:31:19 crc kubenswrapper[4749]: I0320 07:31:19.684317 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-qj5tr"] Mar 20 07:31:19 crc kubenswrapper[4749]: I0320 07:31:19.697639 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-qj5tr" Mar 20 07:31:19 crc kubenswrapper[4749]: I0320 07:31:19.701415 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-qj5tr"] Mar 20 07:31:19 crc kubenswrapper[4749]: I0320 07:31:19.733259 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/978ab210-2b6f-468a-b2a6-4c1c80b5932e-config\") pod \"dnsmasq-dns-57d769cc4f-qj5tr\" (UID: \"978ab210-2b6f-468a-b2a6-4c1c80b5932e\") " pod="openstack/dnsmasq-dns-57d769cc4f-qj5tr" Mar 20 07:31:19 crc kubenswrapper[4749]: I0320 07:31:19.733358 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb77z\" (UniqueName: \"kubernetes.io/projected/978ab210-2b6f-468a-b2a6-4c1c80b5932e-kube-api-access-jb77z\") pod \"dnsmasq-dns-57d769cc4f-qj5tr\" (UID: \"978ab210-2b6f-468a-b2a6-4c1c80b5932e\") " pod="openstack/dnsmasq-dns-57d769cc4f-qj5tr" Mar 20 07:31:19 crc kubenswrapper[4749]: I0320 07:31:19.733382 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/978ab210-2b6f-468a-b2a6-4c1c80b5932e-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-qj5tr\" (UID: \"978ab210-2b6f-468a-b2a6-4c1c80b5932e\") " pod="openstack/dnsmasq-dns-57d769cc4f-qj5tr" Mar 20 07:31:19 crc kubenswrapper[4749]: I0320 07:31:19.834725 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb77z\" (UniqueName: \"kubernetes.io/projected/978ab210-2b6f-468a-b2a6-4c1c80b5932e-kube-api-access-jb77z\") pod \"dnsmasq-dns-57d769cc4f-qj5tr\" (UID: \"978ab210-2b6f-468a-b2a6-4c1c80b5932e\") " pod="openstack/dnsmasq-dns-57d769cc4f-qj5tr" Mar 20 07:31:19 crc kubenswrapper[4749]: I0320 07:31:19.834768 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/978ab210-2b6f-468a-b2a6-4c1c80b5932e-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-qj5tr\" (UID: \"978ab210-2b6f-468a-b2a6-4c1c80b5932e\") " pod="openstack/dnsmasq-dns-57d769cc4f-qj5tr" Mar 20 07:31:19 crc kubenswrapper[4749]: I0320 07:31:19.834844 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/978ab210-2b6f-468a-b2a6-4c1c80b5932e-config\") pod \"dnsmasq-dns-57d769cc4f-qj5tr\" (UID: \"978ab210-2b6f-468a-b2a6-4c1c80b5932e\") " pod="openstack/dnsmasq-dns-57d769cc4f-qj5tr" Mar 20 07:31:19 crc kubenswrapper[4749]: I0320 07:31:19.835640 4749 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/978ab210-2b6f-468a-b2a6-4c1c80b5932e-config\") pod \"dnsmasq-dns-57d769cc4f-qj5tr\" (UID: \"978ab210-2b6f-468a-b2a6-4c1c80b5932e\") " pod="openstack/dnsmasq-dns-57d769cc4f-qj5tr" Mar 20 07:31:19 crc kubenswrapper[4749]: I0320 07:31:19.836857 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/978ab210-2b6f-468a-b2a6-4c1c80b5932e-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-qj5tr\" (UID: \"978ab210-2b6f-468a-b2a6-4c1c80b5932e\") " pod="openstack/dnsmasq-dns-57d769cc4f-qj5tr" Mar 20 07:31:19 crc kubenswrapper[4749]: I0320 07:31:19.857970 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb77z\" (UniqueName: \"kubernetes.io/projected/978ab210-2b6f-468a-b2a6-4c1c80b5932e-kube-api-access-jb77z\") pod \"dnsmasq-dns-57d769cc4f-qj5tr\" (UID: \"978ab210-2b6f-468a-b2a6-4c1c80b5932e\") " pod="openstack/dnsmasq-dns-57d769cc4f-qj5tr" Mar 20 07:31:19 crc kubenswrapper[4749]: I0320 07:31:19.914769 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-6mxcw"] Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.052831 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-qj5tr" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.299859 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.302008 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.308759 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.308849 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.309093 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-5vww7" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.309256 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.309457 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.310596 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.310640 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.315344 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.345049 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8db06e36-0b00-4157-9345-69449da3e85f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8db06e36-0b00-4157-9345-69449da3e85f\") " pod="openstack/rabbitmq-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.345092 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8db06e36-0b00-4157-9345-69449da3e85f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8db06e36-0b00-4157-9345-69449da3e85f\") " pod="openstack/rabbitmq-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.345115 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8db06e36-0b00-4157-9345-69449da3e85f-config-data\") pod \"rabbitmq-server-0\" (UID: \"8db06e36-0b00-4157-9345-69449da3e85f\") " pod="openstack/rabbitmq-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.345132 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8db06e36-0b00-4157-9345-69449da3e85f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"8db06e36-0b00-4157-9345-69449da3e85f\") " pod="openstack/rabbitmq-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.345193 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8db06e36-0b00-4157-9345-69449da3e85f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8db06e36-0b00-4157-9345-69449da3e85f\") " pod="openstack/rabbitmq-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.345220 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rjx7\" (UniqueName: \"kubernetes.io/projected/8db06e36-0b00-4157-9345-69449da3e85f-kube-api-access-9rjx7\") pod \"rabbitmq-server-0\" (UID: \"8db06e36-0b00-4157-9345-69449da3e85f\") " pod="openstack/rabbitmq-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.345264 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8db06e36-0b00-4157-9345-69449da3e85f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8db06e36-0b00-4157-9345-69449da3e85f\") " pod="openstack/rabbitmq-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.345302 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"8db06e36-0b00-4157-9345-69449da3e85f\") " pod="openstack/rabbitmq-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.345385 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8db06e36-0b00-4157-9345-69449da3e85f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8db06e36-0b00-4157-9345-69449da3e85f\") " pod="openstack/rabbitmq-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.345438 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8db06e36-0b00-4157-9345-69449da3e85f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8db06e36-0b00-4157-9345-69449da3e85f\") " pod="openstack/rabbitmq-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.345467 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/8db06e36-0b00-4157-9345-69449da3e85f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8db06e36-0b00-4157-9345-69449da3e85f\") " pod="openstack/rabbitmq-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.389157 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-6mxcw" event={"ID":"5792dc4b-6da9-4a1b-9450-62d6748d79cf","Type":"ContainerStarted","Data":"bee93880d50a4be0f4b1014ab709c0f86ee4c46494066b94de9164045c5dc3a0"} Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.446929 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8db06e36-0b00-4157-9345-69449da3e85f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8db06e36-0b00-4157-9345-69449da3e85f\") " pod="openstack/rabbitmq-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.446985 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"8db06e36-0b00-4157-9345-69449da3e85f\") " pod="openstack/rabbitmq-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.447009 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8db06e36-0b00-4157-9345-69449da3e85f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8db06e36-0b00-4157-9345-69449da3e85f\") " pod="openstack/rabbitmq-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.447037 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8db06e36-0b00-4157-9345-69449da3e85f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8db06e36-0b00-4157-9345-69449da3e85f\") " pod="openstack/rabbitmq-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.447062 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8db06e36-0b00-4157-9345-69449da3e85f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8db06e36-0b00-4157-9345-69449da3e85f\") " pod="openstack/rabbitmq-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.447092 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8db06e36-0b00-4157-9345-69449da3e85f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8db06e36-0b00-4157-9345-69449da3e85f\") " pod="openstack/rabbitmq-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.447116 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8db06e36-0b00-4157-9345-69449da3e85f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8db06e36-0b00-4157-9345-69449da3e85f\") " pod="openstack/rabbitmq-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.447142 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8db06e36-0b00-4157-9345-69449da3e85f-config-data\") pod \"rabbitmq-server-0\" (UID: \"8db06e36-0b00-4157-9345-69449da3e85f\") " pod="openstack/rabbitmq-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.447163 4749 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8db06e36-0b00-4157-9345-69449da3e85f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"8db06e36-0b00-4157-9345-69449da3e85f\") " pod="openstack/rabbitmq-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.447202 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8db06e36-0b00-4157-9345-69449da3e85f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8db06e36-0b00-4157-9345-69449da3e85f\") " pod="openstack/rabbitmq-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.447235 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rjx7\" (UniqueName: \"kubernetes.io/projected/8db06e36-0b00-4157-9345-69449da3e85f-kube-api-access-9rjx7\") pod \"rabbitmq-server-0\" (UID: \"8db06e36-0b00-4157-9345-69449da3e85f\") " pod="openstack/rabbitmq-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.447726 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"8db06e36-0b00-4157-9345-69449da3e85f\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.450430 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8db06e36-0b00-4157-9345-69449da3e85f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8db06e36-0b00-4157-9345-69449da3e85f\") " pod="openstack/rabbitmq-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.451147 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8db06e36-0b00-4157-9345-69449da3e85f-config-data\") pod \"rabbitmq-server-0\" (UID: \"8db06e36-0b00-4157-9345-69449da3e85f\") " pod="openstack/rabbitmq-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.451182 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8db06e36-0b00-4157-9345-69449da3e85f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8db06e36-0b00-4157-9345-69449da3e85f\") " pod="openstack/rabbitmq-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.451246 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8db06e36-0b00-4157-9345-69449da3e85f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8db06e36-0b00-4157-9345-69449da3e85f\") " pod="openstack/rabbitmq-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.451672 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8db06e36-0b00-4157-9345-69449da3e85f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8db06e36-0b00-4157-9345-69449da3e85f\") " pod="openstack/rabbitmq-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.455801 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8db06e36-0b00-4157-9345-69449da3e85f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8db06e36-0b00-4157-9345-69449da3e85f\") " pod="openstack/rabbitmq-server-0" Mar 20 07:31:20 crc 
kubenswrapper[4749]: I0320 07:31:20.456042 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8db06e36-0b00-4157-9345-69449da3e85f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8db06e36-0b00-4157-9345-69449da3e85f\") " pod="openstack/rabbitmq-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.456084 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8db06e36-0b00-4157-9345-69449da3e85f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8db06e36-0b00-4157-9345-69449da3e85f\") " pod="openstack/rabbitmq-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.456108 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8db06e36-0b00-4157-9345-69449da3e85f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"8db06e36-0b00-4157-9345-69449da3e85f\") " pod="openstack/rabbitmq-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.457472 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-qj5tr"] Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.464898 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rjx7\" (UniqueName: \"kubernetes.io/projected/8db06e36-0b00-4157-9345-69449da3e85f-kube-api-access-9rjx7\") pod \"rabbitmq-server-0\" (UID: \"8db06e36-0b00-4157-9345-69449da3e85f\") " pod="openstack/rabbitmq-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: W0320 07:31:20.469814 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod978ab210_2b6f_468a_b2a6_4c1c80b5932e.slice/crio-e0551ea407906739757bf5dd242d29844549120d2ac53558f8f684a6214254f5 WatchSource:0}: Error finding container e0551ea407906739757bf5dd242d29844549120d2ac53558f8f684a6214254f5: Status 404 returned error can't find the container with id e0551ea407906739757bf5dd242d29844549120d2ac53558f8f684a6214254f5 Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.478008 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"8db06e36-0b00-4157-9345-69449da3e85f\") " pod="openstack/rabbitmq-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.594701 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.596055 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.598019 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.598035 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.598424 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.598673 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.598830 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-d5lqz" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.598943 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.599165 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.607625 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.627351 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.652990 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8b9b402f-2d95-48f5-98d8-497d90956ba2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b9b402f-2d95-48f5-98d8-497d90956ba2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.653036 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b9b402f-2d95-48f5-98d8-497d90956ba2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.653078 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8b9b402f-2d95-48f5-98d8-497d90956ba2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b9b402f-2d95-48f5-98d8-497d90956ba2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.653105 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8b9b402f-2d95-48f5-98d8-497d90956ba2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b9b402f-2d95-48f5-98d8-497d90956ba2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.653118 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8b9b402f-2d95-48f5-98d8-497d90956ba2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"8b9b402f-2d95-48f5-98d8-497d90956ba2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.653138 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8b9b402f-2d95-48f5-98d8-497d90956ba2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b9b402f-2d95-48f5-98d8-497d90956ba2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.653151 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8b9b402f-2d95-48f5-98d8-497d90956ba2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b9b402f-2d95-48f5-98d8-497d90956ba2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.653180 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8b9b402f-2d95-48f5-98d8-497d90956ba2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b9b402f-2d95-48f5-98d8-497d90956ba2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.653212 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8b9b402f-2d95-48f5-98d8-497d90956ba2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b9b402f-2d95-48f5-98d8-497d90956ba2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.653229 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8b9b402f-2d95-48f5-98d8-497d90956ba2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b9b402f-2d95-48f5-98d8-497d90956ba2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.653272 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crscr\" (UniqueName: \"kubernetes.io/projected/8b9b402f-2d95-48f5-98d8-497d90956ba2-kube-api-access-crscr\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b9b402f-2d95-48f5-98d8-497d90956ba2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.754603 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8b9b402f-2d95-48f5-98d8-497d90956ba2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b9b402f-2d95-48f5-98d8-497d90956ba2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.755059 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8b9b402f-2d95-48f5-98d8-497d90956ba2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b9b402f-2d95-48f5-98d8-497d90956ba2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.755104 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8b9b402f-2d95-48f5-98d8-497d90956ba2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"8b9b402f-2d95-48f5-98d8-497d90956ba2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.755126 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8b9b402f-2d95-48f5-98d8-497d90956ba2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b9b402f-2d95-48f5-98d8-497d90956ba2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.755140 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8b9b402f-2d95-48f5-98d8-497d90956ba2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b9b402f-2d95-48f5-98d8-497d90956ba2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.755164 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8b9b402f-2d95-48f5-98d8-497d90956ba2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b9b402f-2d95-48f5-98d8-497d90956ba2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.755191 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8b9b402f-2d95-48f5-98d8-497d90956ba2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b9b402f-2d95-48f5-98d8-497d90956ba2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.755208 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8b9b402f-2d95-48f5-98d8-497d90956ba2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b9b402f-2d95-48f5-98d8-497d90956ba2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.755239 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crscr\" (UniqueName: \"kubernetes.io/projected/8b9b402f-2d95-48f5-98d8-497d90956ba2-kube-api-access-crscr\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b9b402f-2d95-48f5-98d8-497d90956ba2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.755317 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8b9b402f-2d95-48f5-98d8-497d90956ba2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b9b402f-2d95-48f5-98d8-497d90956ba2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.755346 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b9b402f-2d95-48f5-98d8-497d90956ba2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.755606 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b9b402f-2d95-48f5-98d8-497d90956ba2\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 
07:31:20.757729 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8b9b402f-2d95-48f5-98d8-497d90956ba2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b9b402f-2d95-48f5-98d8-497d90956ba2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.758093 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8b9b402f-2d95-48f5-98d8-497d90956ba2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b9b402f-2d95-48f5-98d8-497d90956ba2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.758113 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8b9b402f-2d95-48f5-98d8-497d90956ba2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b9b402f-2d95-48f5-98d8-497d90956ba2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.758325 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8b9b402f-2d95-48f5-98d8-497d90956ba2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b9b402f-2d95-48f5-98d8-497d90956ba2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.758740 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8b9b402f-2d95-48f5-98d8-497d90956ba2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b9b402f-2d95-48f5-98d8-497d90956ba2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.758775 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8b9b402f-2d95-48f5-98d8-497d90956ba2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b9b402f-2d95-48f5-98d8-497d90956ba2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.763346 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8b9b402f-2d95-48f5-98d8-497d90956ba2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b9b402f-2d95-48f5-98d8-497d90956ba2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.763764 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8b9b402f-2d95-48f5-98d8-497d90956ba2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b9b402f-2d95-48f5-98d8-497d90956ba2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.774031 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8b9b402f-2d95-48f5-98d8-497d90956ba2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b9b402f-2d95-48f5-98d8-497d90956ba2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.777375 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crscr\" (UniqueName: \"kubernetes.io/projected/8b9b402f-2d95-48f5-98d8-497d90956ba2-kube-api-access-crscr\") 
pod \"rabbitmq-cell1-server-0\" (UID: \"8b9b402f-2d95-48f5-98d8-497d90956ba2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.798939 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b9b402f-2d95-48f5-98d8-497d90956ba2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.892190 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 07:31:20 crc kubenswrapper[4749]: I0320 07:31:20.923078 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:31:20 crc kubenswrapper[4749]: W0320 07:31:20.930434 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8db06e36_0b00_4157_9345_69449da3e85f.slice/crio-37f27917463d139ea172983612759395e7c56c90e4dd71660126cefd2e88dfd5 WatchSource:0}: Error finding container 37f27917463d139ea172983612759395e7c56c90e4dd71660126cefd2e88dfd5: Status 404 returned error can't find the container with id 37f27917463d139ea172983612759395e7c56c90e4dd71660126cefd2e88dfd5 Mar 20 07:31:21 crc kubenswrapper[4749]: I0320 07:31:21.402215 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8db06e36-0b00-4157-9345-69449da3e85f","Type":"ContainerStarted","Data":"37f27917463d139ea172983612759395e7c56c90e4dd71660126cefd2e88dfd5"} Mar 20 07:31:21 crc kubenswrapper[4749]: I0320 07:31:21.404580 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-qj5tr" event={"ID":"978ab210-2b6f-468a-b2a6-4c1c80b5932e","Type":"ContainerStarted","Data":"e0551ea407906739757bf5dd242d29844549120d2ac53558f8f684a6214254f5"} Mar 20 07:31:21 crc kubenswrapper[4749]: I0320 07:31:21.555246 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 07:31:22 crc kubenswrapper[4749]: I0320 07:31:22.210073 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 20 07:31:22 crc kubenswrapper[4749]: I0320 07:31:22.211160 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0"
Mar 20 07:31:22 crc kubenswrapper[4749]: I0320 07:31:22.212054 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Mar 20 07:31:22 crc kubenswrapper[4749]: I0320 07:31:22.221802 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data"
Mar 20 07:31:22 crc kubenswrapper[4749]: I0320 07:31:22.222046 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-srg6r"
Mar 20 07:31:22 crc kubenswrapper[4749]: I0320 07:31:22.223562 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts"
Mar 20 07:31:22 crc kubenswrapper[4749]: I0320 07:31:22.224227 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc"
Mar 20 07:31:22 crc kubenswrapper[4749]: I0320 07:31:22.225813 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle"
Mar 20 07:31:22 crc kubenswrapper[4749]: I0320 07:31:22.390720 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c96afef-fa85-45f2-89cd-2fb2db26b9f8-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"1c96afef-fa85-45f2-89cd-2fb2db26b9f8\") " pod="openstack/openstack-galera-0"
Mar 20 07:31:22 crc kubenswrapper[4749]: I0320 07:31:22.391364 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1c96afef-fa85-45f2-89cd-2fb2db26b9f8-kolla-config\") pod \"openstack-galera-0\" (UID: \"1c96afef-fa85-45f2-89cd-2fb2db26b9f8\") " pod="openstack/openstack-galera-0"
Mar 20 07:31:22 crc kubenswrapper[4749]: I0320 07:31:22.391437 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c96afef-fa85-45f2-89cd-2fb2db26b9f8-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"1c96afef-fa85-45f2-89cd-2fb2db26b9f8\") " pod="openstack/openstack-galera-0"
Mar 20 07:31:22 crc kubenswrapper[4749]: I0320 07:31:22.391469 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c96afef-fa85-45f2-89cd-2fb2db26b9f8-operator-scripts\") pod \"openstack-galera-0\" (UID: \"1c96afef-fa85-45f2-89cd-2fb2db26b9f8\") " pod="openstack/openstack-galera-0"
Mar 20 07:31:22 crc kubenswrapper[4749]: I0320 07:31:22.391572 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1c96afef-fa85-45f2-89cd-2fb2db26b9f8-config-data-generated\") pod \"openstack-galera-0\" (UID: \"1c96afef-fa85-45f2-89cd-2fb2db26b9f8\") " pod="openstack/openstack-galera-0"
Mar 20 07:31:22 crc kubenswrapper[4749]: I0320 07:31:22.391601 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1c96afef-fa85-45f2-89cd-2fb2db26b9f8-config-data-default\") pod \"openstack-galera-0\" (UID: \"1c96afef-fa85-45f2-89cd-2fb2db26b9f8\") " pod="openstack/openstack-galera-0"
Mar 20 07:31:22 crc kubenswrapper[4749]: I0320 07:31:22.391754 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"1c96afef-fa85-45f2-89cd-2fb2db26b9f8\") " pod="openstack/openstack-galera-0"
Mar 20 07:31:22 crc kubenswrapper[4749]: I0320 07:31:22.392223 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crg2c\" (UniqueName: \"kubernetes.io/projected/1c96afef-fa85-45f2-89cd-2fb2db26b9f8-kube-api-access-crg2c\") pod \"openstack-galera-0\" (UID: \"1c96afef-fa85-45f2-89cd-2fb2db26b9f8\") " pod="openstack/openstack-galera-0"
Mar 20 07:31:22 crc kubenswrapper[4749]: I0320 07:31:22.494771 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1c96afef-fa85-45f2-89cd-2fb2db26b9f8-config-data-generated\") pod \"openstack-galera-0\" (UID: \"1c96afef-fa85-45f2-89cd-2fb2db26b9f8\") " pod="openstack/openstack-galera-0"
Mar 20 07:31:22 crc kubenswrapper[4749]: I0320 07:31:22.494830 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1c96afef-fa85-45f2-89cd-2fb2db26b9f8-config-data-default\") pod \"openstack-galera-0\" (UID: \"1c96afef-fa85-45f2-89cd-2fb2db26b9f8\") " pod="openstack/openstack-galera-0"
Mar 20 07:31:22 crc kubenswrapper[4749]: I0320 07:31:22.494870 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"1c96afef-fa85-45f2-89cd-2fb2db26b9f8\") " pod="openstack/openstack-galera-0"
Mar 20 07:31:22 crc kubenswrapper[4749]: I0320 07:31:22.495266 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crg2c\" (UniqueName: \"kubernetes.io/projected/1c96afef-fa85-45f2-89cd-2fb2db26b9f8-kube-api-access-crg2c\") pod \"openstack-galera-0\" (UID: \"1c96afef-fa85-45f2-89cd-2fb2db26b9f8\") " pod="openstack/openstack-galera-0"
Mar 20 07:31:22 crc kubenswrapper[4749]: I0320 07:31:22.495387 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c96afef-fa85-45f2-89cd-2fb2db26b9f8-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"1c96afef-fa85-45f2-89cd-2fb2db26b9f8\") " pod="openstack/openstack-galera-0"
Mar 20 07:31:22 crc kubenswrapper[4749]: I0320 07:31:22.495428 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1c96afef-fa85-45f2-89cd-2fb2db26b9f8-kolla-config\") pod \"openstack-galera-0\" (UID: \"1c96afef-fa85-45f2-89cd-2fb2db26b9f8\") " pod="openstack/openstack-galera-0"
Mar 20 07:31:22 crc kubenswrapper[4749]: I0320 07:31:22.495455 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c96afef-fa85-45f2-89cd-2fb2db26b9f8-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"1c96afef-fa85-45f2-89cd-2fb2db26b9f8\") " pod="openstack/openstack-galera-0"
Mar 20 07:31:22 crc kubenswrapper[4749]: I0320 07:31:22.495482 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c96afef-fa85-45f2-89cd-2fb2db26b9f8-operator-scripts\") pod \"openstack-galera-0\" (UID: \"1c96afef-fa85-45f2-89cd-2fb2db26b9f8\") " pod="openstack/openstack-galera-0"
Mar 20 07:31:22 crc kubenswrapper[4749]: I0320 07:31:22.496935 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"1c96afef-fa85-45f2-89cd-2fb2db26b9f8\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/openstack-galera-0"
Mar 20 07:31:22 crc kubenswrapper[4749]: I0320 07:31:22.497351 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1c96afef-fa85-45f2-89cd-2fb2db26b9f8-config-data-default\") pod \"openstack-galera-0\" (UID: \"1c96afef-fa85-45f2-89cd-2fb2db26b9f8\") " pod="openstack/openstack-galera-0"
Mar 20 07:31:22 crc kubenswrapper[4749]: I0320 07:31:22.497418 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1c96afef-fa85-45f2-89cd-2fb2db26b9f8-kolla-config\") pod \"openstack-galera-0\" (UID: \"1c96afef-fa85-45f2-89cd-2fb2db26b9f8\") " pod="openstack/openstack-galera-0"
Mar 20 07:31:22 crc kubenswrapper[4749]: I0320 07:31:22.497589 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1c96afef-fa85-45f2-89cd-2fb2db26b9f8-config-data-generated\") pod \"openstack-galera-0\" (UID: \"1c96afef-fa85-45f2-89cd-2fb2db26b9f8\") " pod="openstack/openstack-galera-0"
Mar 20 07:31:22 crc kubenswrapper[4749]: I0320 07:31:22.500962 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c96afef-fa85-45f2-89cd-2fb2db26b9f8-operator-scripts\") pod \"openstack-galera-0\" (UID: \"1c96afef-fa85-45f2-89cd-2fb2db26b9f8\") " pod="openstack/openstack-galera-0"
Mar 20 07:31:22 crc kubenswrapper[4749]: I0320 07:31:22.502438 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c96afef-fa85-45f2-89cd-2fb2db26b9f8-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"1c96afef-fa85-45f2-89cd-2fb2db26b9f8\") " pod="openstack/openstack-galera-0"
Mar 20 07:31:22 crc kubenswrapper[4749]: I0320 07:31:22.504886 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c96afef-fa85-45f2-89cd-2fb2db26b9f8-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"1c96afef-fa85-45f2-89cd-2fb2db26b9f8\") " pod="openstack/openstack-galera-0"
Mar 20 07:31:22 crc kubenswrapper[4749]: I0320 07:31:22.519548 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crg2c\" (UniqueName: \"kubernetes.io/projected/1c96afef-fa85-45f2-89cd-2fb2db26b9f8-kube-api-access-crg2c\") pod \"openstack-galera-0\" (UID: \"1c96afef-fa85-45f2-89cd-2fb2db26b9f8\") " pod="openstack/openstack-galera-0"
Mar 20 07:31:22 crc kubenswrapper[4749]: I0320 07:31:22.528615 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"1c96afef-fa85-45f2-89cd-2fb2db26b9f8\") " pod="openstack/openstack-galera-0"
Mar 20 07:31:22 crc kubenswrapper[4749]: I0320 07:31:22.545173 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Mar 20 07:31:23 crc kubenswrapper[4749]: I0320 07:31:23.498956 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"]
Mar 20 07:31:23 crc kubenswrapper[4749]: I0320 07:31:23.501200 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Mar 20 07:31:23 crc kubenswrapper[4749]: I0320 07:31:23.503404 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts"
Mar 20 07:31:23 crc kubenswrapper[4749]: I0320 07:31:23.503612 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-kq5c8"
Mar 20 07:31:23 crc kubenswrapper[4749]: I0320 07:31:23.504051 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc"
Mar 20 07:31:23 crc kubenswrapper[4749]: I0320 07:31:23.504236 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data"
Mar 20 07:31:23 crc kubenswrapper[4749]: I0320 07:31:23.505890 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Mar 20 07:31:23 crc kubenswrapper[4749]: I0320 07:31:23.609549 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"8052bc33-6f6a-437e-9df5-508256f7e32f\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 07:31:23 crc kubenswrapper[4749]: I0320 07:31:23.609621 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8052bc33-6f6a-437e-9df5-508256f7e32f-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"8052bc33-6f6a-437e-9df5-508256f7e32f\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 07:31:23 crc kubenswrapper[4749]: I0320 07:31:23.609660 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8052bc33-6f6a-437e-9df5-508256f7e32f-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"8052bc33-6f6a-437e-9df5-508256f7e32f\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 07:31:23 crc kubenswrapper[4749]: I0320 07:31:23.609677 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8052bc33-6f6a-437e-9df5-508256f7e32f-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"8052bc33-6f6a-437e-9df5-508256f7e32f\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 07:31:23 crc kubenswrapper[4749]: I0320 07:31:23.609700 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8052bc33-6f6a-437e-9df5-508256f7e32f-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"8052bc33-6f6a-437e-9df5-508256f7e32f\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 07:31:23 crc kubenswrapper[4749]: I0320 07:31:23.609718 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8052bc33-6f6a-437e-9df5-508256f7e32f-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"8052bc33-6f6a-437e-9df5-508256f7e32f\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 07:31:23 crc kubenswrapper[4749]: I0320 07:31:23.609735 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbn46\" (UniqueName: \"kubernetes.io/projected/8052bc33-6f6a-437e-9df5-508256f7e32f-kube-api-access-pbn46\") pod \"openstack-cell1-galera-0\" (UID: \"8052bc33-6f6a-437e-9df5-508256f7e32f\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 07:31:23 crc kubenswrapper[4749]: I0320 07:31:23.609752 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8052bc33-6f6a-437e-9df5-508256f7e32f-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"8052bc33-6f6a-437e-9df5-508256f7e32f\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 07:31:23 crc kubenswrapper[4749]: I0320 07:31:23.710986 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"8052bc33-6f6a-437e-9df5-508256f7e32f\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 07:31:23 crc kubenswrapper[4749]: I0320 07:31:23.711250 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"8052bc33-6f6a-437e-9df5-508256f7e32f\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/openstack-cell1-galera-0"
Mar 20 07:31:23 crc kubenswrapper[4749]: I0320 07:31:23.717534 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8052bc33-6f6a-437e-9df5-508256f7e32f-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"8052bc33-6f6a-437e-9df5-508256f7e32f\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 07:31:23 crc kubenswrapper[4749]: I0320 07:31:23.717686 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8052bc33-6f6a-437e-9df5-508256f7e32f-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"8052bc33-6f6a-437e-9df5-508256f7e32f\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 07:31:23 crc kubenswrapper[4749]: I0320 07:31:23.717718 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8052bc33-6f6a-437e-9df5-508256f7e32f-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"8052bc33-6f6a-437e-9df5-508256f7e32f\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 07:31:23 crc kubenswrapper[4749]: I0320 07:31:23.717757 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8052bc33-6f6a-437e-9df5-508256f7e32f-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"8052bc33-6f6a-437e-9df5-508256f7e32f\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 07:31:23 crc kubenswrapper[4749]: I0320 07:31:23.717797 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8052bc33-6f6a-437e-9df5-508256f7e32f-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"8052bc33-6f6a-437e-9df5-508256f7e32f\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 07:31:23 crc kubenswrapper[4749]: I0320 07:31:23.717823 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbn46\" (UniqueName: \"kubernetes.io/projected/8052bc33-6f6a-437e-9df5-508256f7e32f-kube-api-access-pbn46\") pod \"openstack-cell1-galera-0\" (UID: \"8052bc33-6f6a-437e-9df5-508256f7e32f\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 07:31:23 crc kubenswrapper[4749]: I0320 07:31:23.717850 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8052bc33-6f6a-437e-9df5-508256f7e32f-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"8052bc33-6f6a-437e-9df5-508256f7e32f\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 07:31:23 crc kubenswrapper[4749]: I0320 07:31:23.718685 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8052bc33-6f6a-437e-9df5-508256f7e32f-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"8052bc33-6f6a-437e-9df5-508256f7e32f\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 07:31:23 crc kubenswrapper[4749]: I0320 07:31:23.718755 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8052bc33-6f6a-437e-9df5-508256f7e32f-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"8052bc33-6f6a-437e-9df5-508256f7e32f\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 07:31:23 crc kubenswrapper[4749]: I0320 07:31:23.718946 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8052bc33-6f6a-437e-9df5-508256f7e32f-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"8052bc33-6f6a-437e-9df5-508256f7e32f\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 07:31:23 crc kubenswrapper[4749]: I0320 07:31:23.719682 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8052bc33-6f6a-437e-9df5-508256f7e32f-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"8052bc33-6f6a-437e-9df5-508256f7e32f\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 07:31:23 crc kubenswrapper[4749]: I0320 07:31:23.727304 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8052bc33-6f6a-437e-9df5-508256f7e32f-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"8052bc33-6f6a-437e-9df5-508256f7e32f\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 07:31:23 crc kubenswrapper[4749]: I0320 07:31:23.735268 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8052bc33-6f6a-437e-9df5-508256f7e32f-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"8052bc33-6f6a-437e-9df5-508256f7e32f\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 07:31:23 crc kubenswrapper[4749]: I0320 07:31:23.735499 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbn46\" (UniqueName: \"kubernetes.io/projected/8052bc33-6f6a-437e-9df5-508256f7e32f-kube-api-access-pbn46\") pod \"openstack-cell1-galera-0\" (UID: \"8052bc33-6f6a-437e-9df5-508256f7e32f\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 07:31:23 crc kubenswrapper[4749]: I0320 07:31:23.746088 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"8052bc33-6f6a-437e-9df5-508256f7e32f\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 07:31:23 crc kubenswrapper[4749]: I0320 07:31:23.786752 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"]
Mar 20 07:31:23 crc kubenswrapper[4749]: I0320 07:31:23.787903 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Mar 20 07:31:23 crc kubenswrapper[4749]: I0320 07:31:23.793837 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-l5skr"
Mar 20 07:31:23 crc kubenswrapper[4749]: I0320 07:31:23.799766 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc"
Mar 20 07:31:23 crc kubenswrapper[4749]: I0320 07:31:23.799886 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data"
Mar 20 07:31:23 crc kubenswrapper[4749]: I0320 07:31:23.825044 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Mar 20 07:31:23 crc kubenswrapper[4749]: I0320 07:31:23.827171 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Mar 20 07:31:23 crc kubenswrapper[4749]: I0320 07:31:23.919749 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e8fd1b2c-620b-44a6-b4f0-1c4d2cbda056-kolla-config\") pod \"memcached-0\" (UID: \"e8fd1b2c-620b-44a6-b4f0-1c4d2cbda056\") " pod="openstack/memcached-0"
Mar 20 07:31:23 crc kubenswrapper[4749]: I0320 07:31:23.919829 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8fd1b2c-620b-44a6-b4f0-1c4d2cbda056-combined-ca-bundle\") pod \"memcached-0\" (UID: \"e8fd1b2c-620b-44a6-b4f0-1c4d2cbda056\") " pod="openstack/memcached-0"
Mar 20 07:31:23 crc kubenswrapper[4749]: I0320 07:31:23.919904 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjg5r\" (UniqueName: \"kubernetes.io/projected/e8fd1b2c-620b-44a6-b4f0-1c4d2cbda056-kube-api-access-mjg5r\") pod \"memcached-0\" (UID: \"e8fd1b2c-620b-44a6-b4f0-1c4d2cbda056\") " pod="openstack/memcached-0"
Mar 20 07:31:23 crc kubenswrapper[4749]: I0320 07:31:23.919947 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8fd1b2c-620b-44a6-b4f0-1c4d2cbda056-memcached-tls-certs\") pod \"memcached-0\" (UID: \"e8fd1b2c-620b-44a6-b4f0-1c4d2cbda056\") " pod="openstack/memcached-0"
Mar 20 07:31:23 crc kubenswrapper[4749]: I0320 07:31:23.919969 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e8fd1b2c-620b-44a6-b4f0-1c4d2cbda056-config-data\") pod \"memcached-0\" (UID: \"e8fd1b2c-620b-44a6-b4f0-1c4d2cbda056\") " pod="openstack/memcached-0"
Mar 20 07:31:24 crc kubenswrapper[4749]: I0320 07:31:24.021733 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8fd1b2c-620b-44a6-b4f0-1c4d2cbda056-combined-ca-bundle\") pod \"memcached-0\" (UID: \"e8fd1b2c-620b-44a6-b4f0-1c4d2cbda056\") " pod="openstack/memcached-0"
Mar 20 07:31:24 crc kubenswrapper[4749]: I0320 07:31:24.021833 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjg5r\" (UniqueName: \"kubernetes.io/projected/e8fd1b2c-620b-44a6-b4f0-1c4d2cbda056-kube-api-access-mjg5r\") pod \"memcached-0\" (UID: \"e8fd1b2c-620b-44a6-b4f0-1c4d2cbda056\") " pod="openstack/memcached-0"
Mar 20 07:31:24 crc kubenswrapper[4749]: I0320 07:31:24.021882 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8fd1b2c-620b-44a6-b4f0-1c4d2cbda056-memcached-tls-certs\") pod \"memcached-0\" (UID: \"e8fd1b2c-620b-44a6-b4f0-1c4d2cbda056\") " pod="openstack/memcached-0"
Mar 20 07:31:24 crc kubenswrapper[4749]: I0320 07:31:24.021905 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e8fd1b2c-620b-44a6-b4f0-1c4d2cbda056-config-data\") pod \"memcached-0\" (UID: \"e8fd1b2c-620b-44a6-b4f0-1c4d2cbda056\") " pod="openstack/memcached-0"
Mar 20 07:31:24 crc kubenswrapper[4749]: I0320 07:31:24.021966 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e8fd1b2c-620b-44a6-b4f0-1c4d2cbda056-kolla-config\") pod \"memcached-0\" (UID: \"e8fd1b2c-620b-44a6-b4f0-1c4d2cbda056\") " pod="openstack/memcached-0"
Mar 20 07:31:24 crc kubenswrapper[4749]: I0320 07:31:24.024764 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e8fd1b2c-620b-44a6-b4f0-1c4d2cbda056-kolla-config\") pod \"memcached-0\" (UID: \"e8fd1b2c-620b-44a6-b4f0-1c4d2cbda056\") " pod="openstack/memcached-0"
Mar 20 07:31:24 crc kubenswrapper[4749]: I0320 07:31:24.027723 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8fd1b2c-620b-44a6-b4f0-1c4d2cbda056-memcached-tls-certs\") pod \"memcached-0\" (UID: \"e8fd1b2c-620b-44a6-b4f0-1c4d2cbda056\") " pod="openstack/memcached-0"
Mar 20 07:31:24 crc kubenswrapper[4749]: I0320 07:31:24.028406 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e8fd1b2c-620b-44a6-b4f0-1c4d2cbda056-config-data\") pod \"memcached-0\" (UID: \"e8fd1b2c-620b-44a6-b4f0-1c4d2cbda056\") " pod="openstack/memcached-0"
Mar 20 07:31:24 crc kubenswrapper[4749]: I0320 07:31:24.029015 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8fd1b2c-620b-44a6-b4f0-1c4d2cbda056-combined-ca-bundle\") pod \"memcached-0\" (UID: \"e8fd1b2c-620b-44a6-b4f0-1c4d2cbda056\") " pod="openstack/memcached-0"
Mar 20 07:31:24 crc kubenswrapper[4749]: I0320 07:31:24.040929 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjg5r\" (UniqueName: \"kubernetes.io/projected/e8fd1b2c-620b-44a6-b4f0-1c4d2cbda056-kube-api-access-mjg5r\") pod \"memcached-0\" (UID: \"e8fd1b2c-620b-44a6-b4f0-1c4d2cbda056\") " pod="openstack/memcached-0"
Mar 20 07:31:24 crc kubenswrapper[4749]: I0320 07:31:24.123828 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Mar 20 07:31:25 crc kubenswrapper[4749]: I0320 07:31:25.438029 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8b9b402f-2d95-48f5-98d8-497d90956ba2","Type":"ContainerStarted","Data":"544692a0d30bd764b76ac83d2e522a85025e81947baa3d59efc7dbd181f7ee32"}
Mar 20 07:31:26 crc kubenswrapper[4749]: I0320 07:31:26.090451 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 20 07:31:26 crc kubenswrapper[4749]: I0320 07:31:26.091724 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 20 07:31:26 crc kubenswrapper[4749]: I0320 07:31:26.094355 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-sgg8m"
Mar 20 07:31:26 crc kubenswrapper[4749]: I0320 07:31:26.103594 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 20 07:31:26 crc kubenswrapper[4749]: I0320 07:31:26.268014 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnjlq\" (UniqueName: \"kubernetes.io/projected/b364d204-dcba-4b43-98e1-f1e22bd89b2c-kube-api-access-jnjlq\") pod \"kube-state-metrics-0\" (UID: \"b364d204-dcba-4b43-98e1-f1e22bd89b2c\") " pod="openstack/kube-state-metrics-0"
Mar 20 07:31:26 crc kubenswrapper[4749]: I0320 07:31:26.369960 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnjlq\" (UniqueName: \"kubernetes.io/projected/b364d204-dcba-4b43-98e1-f1e22bd89b2c-kube-api-access-jnjlq\") pod \"kube-state-metrics-0\" (UID: \"b364d204-dcba-4b43-98e1-f1e22bd89b2c\") " pod="openstack/kube-state-metrics-0"
Mar 20 07:31:26 crc kubenswrapper[4749]: I0320 07:31:26.403349 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnjlq\" (UniqueName: \"kubernetes.io/projected/b364d204-dcba-4b43-98e1-f1e22bd89b2c-kube-api-access-jnjlq\") pod \"kube-state-metrics-0\" (UID: \"b364d204-dcba-4b43-98e1-f1e22bd89b2c\") " pod="openstack/kube-state-metrics-0"
Mar 20 07:31:26 crc kubenswrapper[4749]: I0320 07:31:26.458437 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 20 07:31:29 crc kubenswrapper[4749]: I0320 07:31:29.308111 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-tx9bw"]
Mar 20 07:31:29 crc kubenswrapper[4749]: I0320 07:31:29.309578 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-tx9bw"
Mar 20 07:31:29 crc kubenswrapper[4749]: I0320 07:31:29.312810 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts"
Mar 20 07:31:29 crc kubenswrapper[4749]: I0320 07:31:29.313222 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs"
Mar 20 07:31:29 crc kubenswrapper[4749]: I0320 07:31:29.313395 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-zw52x"
Mar 20 07:31:29 crc kubenswrapper[4749]: I0320 07:31:29.324812 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-kvqdd"]
Mar 20 07:31:29 crc kubenswrapper[4749]: I0320 07:31:29.326805 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-kvqdd"
Mar 20 07:31:29 crc kubenswrapper[4749]: I0320 07:31:29.332859 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-tx9bw"]
Mar 20 07:31:29 crc kubenswrapper[4749]: I0320 07:31:29.359847 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-kvqdd"]
Mar 20 07:31:29 crc kubenswrapper[4749]: I0320 07:31:29.417406 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/32ceaa95-18d9-4f1e-9ebd-f2d413709413-var-log-ovn\") pod \"ovn-controller-tx9bw\" (UID: \"32ceaa95-18d9-4f1e-9ebd-f2d413709413\") " pod="openstack/ovn-controller-tx9bw"
Mar 20 07:31:29 crc kubenswrapper[4749]: I0320 07:31:29.417446 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/32ceaa95-18d9-4f1e-9ebd-f2d413709413-scripts\") pod \"ovn-controller-tx9bw\" (UID: \"32ceaa95-18d9-4f1e-9ebd-f2d413709413\") " pod="openstack/ovn-controller-tx9bw"
Mar 20 07:31:29 crc kubenswrapper[4749]: I0320 07:31:29.417462 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fq2g\" (UniqueName: \"kubernetes.io/projected/32ceaa95-18d9-4f1e-9ebd-f2d413709413-kube-api-access-9fq2g\") pod \"ovn-controller-tx9bw\" (UID: \"32ceaa95-18d9-4f1e-9ebd-f2d413709413\") " pod="openstack/ovn-controller-tx9bw"
Mar 20 07:31:29 crc kubenswrapper[4749]: I0320 07:31:29.417487 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwnzc\" (UniqueName: \"kubernetes.io/projected/d72e69d0-23f3-4d14-ab35-74ea19e79b69-kube-api-access-gwnzc\") pod \"ovn-controller-ovs-kvqdd\" (UID: \"d72e69d0-23f3-4d14-ab35-74ea19e79b69\") " pod="openstack/ovn-controller-ovs-kvqdd"
Mar 20 07:31:29 crc kubenswrapper[4749]: I0320 07:31:29.417578 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/32ceaa95-18d9-4f1e-9ebd-f2d413709413-ovn-controller-tls-certs\") pod \"ovn-controller-tx9bw\" (UID: \"32ceaa95-18d9-4f1e-9ebd-f2d413709413\") " pod="openstack/ovn-controller-tx9bw"
Mar 20 07:31:29 crc kubenswrapper[4749]: I0320 07:31:29.417628 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/32ceaa95-18d9-4f1e-9ebd-f2d413709413-var-run\") pod \"ovn-controller-tx9bw\" (UID: \"32ceaa95-18d9-4f1e-9ebd-f2d413709413\") " pod="openstack/ovn-controller-tx9bw"
Mar 20 07:31:29 crc kubenswrapper[4749]: I0320 07:31:29.417656 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d72e69d0-23f3-4d14-ab35-74ea19e79b69-var-run\") pod \"ovn-controller-ovs-kvqdd\" (UID: \"d72e69d0-23f3-4d14-ab35-74ea19e79b69\") " pod="openstack/ovn-controller-ovs-kvqdd"
Mar 20 07:31:29 crc kubenswrapper[4749]: I0320 07:31:29.417688 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/32ceaa95-18d9-4f1e-9ebd-f2d413709413-var-run-ovn\") pod \"ovn-controller-tx9bw\" (UID: \"32ceaa95-18d9-4f1e-9ebd-f2d413709413\") " pod="openstack/ovn-controller-tx9bw"
Mar 20 07:31:29 crc kubenswrapper[4749]: I0320 07:31:29.417711 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/d72e69d0-23f3-4d14-ab35-74ea19e79b69-var-log\") pod \"ovn-controller-ovs-kvqdd\" (UID: \"d72e69d0-23f3-4d14-ab35-74ea19e79b69\") " pod="openstack/ovn-controller-ovs-kvqdd"
Mar 20 07:31:29 crc kubenswrapper[4749]: I0320 07:31:29.417728 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d72e69d0-23f3-4d14-ab35-74ea19e79b69-scripts\") pod \"ovn-controller-ovs-kvqdd\" (UID: \"d72e69d0-23f3-4d14-ab35-74ea19e79b69\") " pod="openstack/ovn-controller-ovs-kvqdd"
Mar 20 07:31:29 crc kubenswrapper[4749]: I0320 07:31:29.417897 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32ceaa95-18d9-4f1e-9ebd-f2d413709413-combined-ca-bundle\") pod \"ovn-controller-tx9bw\" (UID: \"32ceaa95-18d9-4f1e-9ebd-f2d413709413\") " pod="openstack/ovn-controller-tx9bw"
Mar 20 07:31:29 crc kubenswrapper[4749]: I0320 07:31:29.417966 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/d72e69d0-23f3-4d14-ab35-74ea19e79b69-var-lib\") pod \"ovn-controller-ovs-kvqdd\" (UID: \"d72e69d0-23f3-4d14-ab35-74ea19e79b69\") " pod="openstack/ovn-controller-ovs-kvqdd"
Mar 20 07:31:29 crc kubenswrapper[4749]: I0320 07:31:29.418023 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/d72e69d0-23f3-4d14-ab35-74ea19e79b69-etc-ovs\") pod \"ovn-controller-ovs-kvqdd\" (UID: \"d72e69d0-23f3-4d14-ab35-74ea19e79b69\") " pod="openstack/ovn-controller-ovs-kvqdd"
Mar 20 07:31:29 crc kubenswrapper[4749]: I0320 07:31:29.519798 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32ceaa95-18d9-4f1e-9ebd-f2d413709413-combined-ca-bundle\") pod \"ovn-controller-tx9bw\" (UID: \"32ceaa95-18d9-4f1e-9ebd-f2d413709413\") " pod="openstack/ovn-controller-tx9bw"
Mar 20 07:31:29 crc kubenswrapper[4749]: I0320 07:31:29.519843 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/d72e69d0-23f3-4d14-ab35-74ea19e79b69-var-lib\") pod \"ovn-controller-ovs-kvqdd\" (UID: \"d72e69d0-23f3-4d14-ab35-74ea19e79b69\") " pod="openstack/ovn-controller-ovs-kvqdd"
Mar 20 07:31:29 crc kubenswrapper[4749]: I0320 07:31:29.519878 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/d72e69d0-23f3-4d14-ab35-74ea19e79b69-etc-ovs\") pod \"ovn-controller-ovs-kvqdd\" (UID: \"d72e69d0-23f3-4d14-ab35-74ea19e79b69\") " pod="openstack/ovn-controller-ovs-kvqdd"
Mar 20 07:31:29 crc kubenswrapper[4749]: I0320 07:31:29.519912 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/32ceaa95-18d9-4f1e-9ebd-f2d413709413-var-log-ovn\") pod \"ovn-controller-tx9bw\" (UID: \"32ceaa95-18d9-4f1e-9ebd-f2d413709413\") " pod="openstack/ovn-controller-tx9bw"
Mar 20 07:31:29 crc kubenswrapper[4749]: I0320 07:31:29.519926 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/32ceaa95-18d9-4f1e-9ebd-f2d413709413-scripts\") pod \"ovn-controller-tx9bw\" (UID: \"32ceaa95-18d9-4f1e-9ebd-f2d413709413\") " pod="openstack/ovn-controller-tx9bw"
Mar 20 07:31:29 crc kubenswrapper[4749]: I0320 07:31:29.519942 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fq2g\" (UniqueName: \"kubernetes.io/projected/32ceaa95-18d9-4f1e-9ebd-f2d413709413-kube-api-access-9fq2g\") pod \"ovn-controller-tx9bw\" (UID: \"32ceaa95-18d9-4f1e-9ebd-f2d413709413\") " pod="openstack/ovn-controller-tx9bw"
Mar 20 07:31:29 crc kubenswrapper[4749]: I0320 07:31:29.519961 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwnzc\" (UniqueName: \"kubernetes.io/projected/d72e69d0-23f3-4d14-ab35-74ea19e79b69-kube-api-access-gwnzc\") pod \"ovn-controller-ovs-kvqdd\" (UID: \"d72e69d0-23f3-4d14-ab35-74ea19e79b69\") " pod="openstack/ovn-controller-ovs-kvqdd"
Mar 20 07:31:29 crc kubenswrapper[4749]: I0320 07:31:29.519983 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/32ceaa95-18d9-4f1e-9ebd-f2d413709413-ovn-controller-tls-certs\") pod \"ovn-controller-tx9bw\" (UID: \"32ceaa95-18d9-4f1e-9ebd-f2d413709413\") " pod="openstack/ovn-controller-tx9bw"
Mar 20 07:31:29 crc kubenswrapper[4749]: I0320 07:31:29.520003 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/32ceaa95-18d9-4f1e-9ebd-f2d413709413-var-run\") pod \"ovn-controller-tx9bw\" (UID: \"32ceaa95-18d9-4f1e-9ebd-f2d413709413\") " pod="openstack/ovn-controller-tx9bw"
Mar 20 07:31:29 crc kubenswrapper[4749]: I0320 07:31:29.520023 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d72e69d0-23f3-4d14-ab35-74ea19e79b69-var-run\") pod \"ovn-controller-ovs-kvqdd\" (UID: \"d72e69d0-23f3-4d14-ab35-74ea19e79b69\") " pod="openstack/ovn-controller-ovs-kvqdd"
Mar 20 07:31:29 crc kubenswrapper[4749]: I0320 07:31:29.520061 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/32ceaa95-18d9-4f1e-9ebd-f2d413709413-var-run-ovn\") pod \"ovn-controller-tx9bw\" (UID: \"32ceaa95-18d9-4f1e-9ebd-f2d413709413\") " pod="openstack/ovn-controller-tx9bw"
Mar 20 07:31:29 crc kubenswrapper[4749]: I0320 07:31:29.520085 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/d72e69d0-23f3-4d14-ab35-74ea19e79b69-var-log\") pod \"ovn-controller-ovs-kvqdd\" (UID: \"d72e69d0-23f3-4d14-ab35-74ea19e79b69\") " pod="openstack/ovn-controller-ovs-kvqdd"
Mar 20 07:31:29 crc kubenswrapper[4749]: I0320 07:31:29.520102 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d72e69d0-23f3-4d14-ab35-74ea19e79b69-scripts\") pod \"ovn-controller-ovs-kvqdd\" (UID: \"d72e69d0-23f3-4d14-ab35-74ea19e79b69\") " pod="openstack/ovn-controller-ovs-kvqdd"
Mar 20 07:31:29 crc kubenswrapper[4749]: I0320 07:31:29.520439 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/32ceaa95-18d9-4f1e-9ebd-f2d413709413-var-log-ovn\") pod \"ovn-controller-tx9bw\" (UID: \"32ceaa95-18d9-4f1e-9ebd-f2d413709413\") " pod="openstack/ovn-controller-tx9bw"
Mar 20 07:31:29 crc kubenswrapper[4749]: I0320 07:31:29.520554 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/d72e69d0-23f3-4d14-ab35-74ea19e79b69-var-lib\") pod \"ovn-controller-ovs-kvqdd\" (UID: \"d72e69d0-23f3-4d14-ab35-74ea19e79b69\") " pod="openstack/ovn-controller-ovs-kvqdd"
Mar 20 07:31:29 crc kubenswrapper[4749]: I0320 07:31:29.520656 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/d72e69d0-23f3-4d14-ab35-74ea19e79b69-etc-ovs\") pod \"ovn-controller-ovs-kvqdd\" (UID: \"d72e69d0-23f3-4d14-ab35-74ea19e79b69\") " pod="openstack/ovn-controller-ovs-kvqdd"
Mar 20 07:31:29 crc kubenswrapper[4749]: I0320 07:31:29.521314 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d72e69d0-23f3-4d14-ab35-74ea19e79b69-var-run\") pod \"ovn-controller-ovs-kvqdd\" (UID: \"d72e69d0-23f3-4d14-ab35-74ea19e79b69\") " pod="openstack/ovn-controller-ovs-kvqdd"
Mar 20 07:31:29 crc kubenswrapper[4749]: I0320 07:31:29.521334 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/d72e69d0-23f3-4d14-ab35-74ea19e79b69-var-log\") pod \"ovn-controller-ovs-kvqdd\" (UID: \"d72e69d0-23f3-4d14-ab35-74ea19e79b69\") " pod="openstack/ovn-controller-ovs-kvqdd"
Mar 20 07:31:29 crc kubenswrapper[4749]: I0320 07:31:29.521334 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/32ceaa95-18d9-4f1e-9ebd-f2d413709413-var-run-ovn\") pod \"ovn-controller-tx9bw\" (UID: \"32ceaa95-18d9-4f1e-9ebd-f2d413709413\") " pod="openstack/ovn-controller-tx9bw"
Mar 20 07:31:29 crc kubenswrapper[4749]: I0320 07:31:29.521525 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/32ceaa95-18d9-4f1e-9ebd-f2d413709413-var-run\") pod \"ovn-controller-tx9bw\" (UID: \"32ceaa95-18d9-4f1e-9ebd-f2d413709413\") " pod="openstack/ovn-controller-tx9bw"
Mar 20 07:31:29 crc kubenswrapper[4749]: I0320 07:31:29.522845 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d72e69d0-23f3-4d14-ab35-74ea19e79b69-scripts\") pod \"ovn-controller-ovs-kvqdd\" (UID: \"d72e69d0-23f3-4d14-ab35-74ea19e79b69\") " pod="openstack/ovn-controller-ovs-kvqdd"
Mar 20 07:31:29 crc kubenswrapper[4749]: I0320 07:31:29.525980 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/32ceaa95-18d9-4f1e-9ebd-f2d413709413-scripts\") pod \"ovn-controller-tx9bw\" (UID: \"32ceaa95-18d9-4f1e-9ebd-f2d413709413\") " pod="openstack/ovn-controller-tx9bw"
Mar 20 07:31:29 crc kubenswrapper[4749]: I0320 07:31:29.526880 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/32ceaa95-18d9-4f1e-9ebd-f2d413709413-ovn-controller-tls-certs\") pod \"ovn-controller-tx9bw\" (UID: \"32ceaa95-18d9-4f1e-9ebd-f2d413709413\") " pod="openstack/ovn-controller-tx9bw"
Mar 20 07:31:29 crc kubenswrapper[4749]: I0320 07:31:29.527984 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32ceaa95-18d9-4f1e-9ebd-f2d413709413-combined-ca-bundle\") pod \"ovn-controller-tx9bw\" (UID: \"32ceaa95-18d9-4f1e-9ebd-f2d413709413\") " pod="openstack/ovn-controller-tx9bw"
\"ovn-controller-tx9bw\" (UID: \"32ceaa95-18d9-4f1e-9ebd-f2d413709413\") " pod="openstack/ovn-controller-tx9bw" Mar 20 07:31:29 crc kubenswrapper[4749]: I0320 07:31:29.539984 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwnzc\" (UniqueName: \"kubernetes.io/projected/d72e69d0-23f3-4d14-ab35-74ea19e79b69-kube-api-access-gwnzc\") pod \"ovn-controller-ovs-kvqdd\" (UID: \"d72e69d0-23f3-4d14-ab35-74ea19e79b69\") " pod="openstack/ovn-controller-ovs-kvqdd" Mar 20 07:31:29 crc kubenswrapper[4749]: I0320 07:31:29.551202 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fq2g\" (UniqueName: \"kubernetes.io/projected/32ceaa95-18d9-4f1e-9ebd-f2d413709413-kube-api-access-9fq2g\") pod \"ovn-controller-tx9bw\" (UID: \"32ceaa95-18d9-4f1e-9ebd-f2d413709413\") " pod="openstack/ovn-controller-tx9bw" Mar 20 07:31:29 crc kubenswrapper[4749]: I0320 07:31:29.636864 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-tx9bw" Mar 20 07:31:29 crc kubenswrapper[4749]: I0320 07:31:29.684691 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-kvqdd" Mar 20 07:31:30 crc kubenswrapper[4749]: I0320 07:31:30.197821 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 07:31:30 crc kubenswrapper[4749]: I0320 07:31:30.199298 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 20 07:31:30 crc kubenswrapper[4749]: I0320 07:31:30.200979 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-nbm89" Mar 20 07:31:30 crc kubenswrapper[4749]: I0320 07:31:30.201271 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 20 07:31:30 crc kubenswrapper[4749]: I0320 07:31:30.210352 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 20 07:31:30 crc kubenswrapper[4749]: I0320 07:31:30.210422 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 20 07:31:30 crc kubenswrapper[4749]: I0320 07:31:30.210447 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 20 07:31:30 crc kubenswrapper[4749]: I0320 07:31:30.231877 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 07:31:30 crc kubenswrapper[4749]: I0320 07:31:30.329700 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3723f09-8ae3-44e2-b5c7-7824e62755f7-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a3723f09-8ae3-44e2-b5c7-7824e62755f7\") " pod="openstack/ovsdbserver-nb-0" Mar 20 07:31:30 crc kubenswrapper[4749]: I0320 07:31:30.329759 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a3723f09-8ae3-44e2-b5c7-7824e62755f7-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a3723f09-8ae3-44e2-b5c7-7824e62755f7\") " pod="openstack/ovsdbserver-nb-0" Mar 20 07:31:30 crc kubenswrapper[4749]: I0320 07:31:30.329781 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a3723f09-8ae3-44e2-b5c7-7824e62755f7-config\") pod \"ovsdbserver-nb-0\" (UID: \"a3723f09-8ae3-44e2-b5c7-7824e62755f7\") " pod="openstack/ovsdbserver-nb-0" Mar 20 07:31:30 crc kubenswrapper[4749]: I0320 07:31:30.329954 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3723f09-8ae3-44e2-b5c7-7824e62755f7-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a3723f09-8ae3-44e2-b5c7-7824e62755f7\") " pod="openstack/ovsdbserver-nb-0" Mar 20 07:31:30 crc kubenswrapper[4749]: I0320 07:31:30.330026 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qbrx\" (UniqueName: \"kubernetes.io/projected/a3723f09-8ae3-44e2-b5c7-7824e62755f7-kube-api-access-6qbrx\") pod \"ovsdbserver-nb-0\" (UID: \"a3723f09-8ae3-44e2-b5c7-7824e62755f7\") " pod="openstack/ovsdbserver-nb-0" Mar 20 07:31:30 crc kubenswrapper[4749]: I0320 07:31:30.330121 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3723f09-8ae3-44e2-b5c7-7824e62755f7-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a3723f09-8ae3-44e2-b5c7-7824e62755f7\") " pod="openstack/ovsdbserver-nb-0" Mar 20 07:31:30 crc kubenswrapper[4749]: I0320 07:31:30.330198 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3723f09-8ae3-44e2-b5c7-7824e62755f7-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a3723f09-8ae3-44e2-b5c7-7824e62755f7\") " pod="openstack/ovsdbserver-nb-0" Mar 20 07:31:30 crc kubenswrapper[4749]: I0320 07:31:30.330220 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a3723f09-8ae3-44e2-b5c7-7824e62755f7\") " pod="openstack/ovsdbserver-nb-0" Mar 20 07:31:30 crc kubenswrapper[4749]: I0320 07:31:30.432038 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3723f09-8ae3-44e2-b5c7-7824e62755f7-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a3723f09-8ae3-44e2-b5c7-7824e62755f7\") " pod="openstack/ovsdbserver-nb-0" Mar 20 07:31:30 crc kubenswrapper[4749]: I0320 07:31:30.432136 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3723f09-8ae3-44e2-b5c7-7824e62755f7-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a3723f09-8ae3-44e2-b5c7-7824e62755f7\") " pod="openstack/ovsdbserver-nb-0" Mar 20 07:31:30 crc kubenswrapper[4749]: I0320 07:31:30.432177 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a3723f09-8ae3-44e2-b5c7-7824e62755f7\") " pod="openstack/ovsdbserver-nb-0" Mar 20 07:31:30 crc kubenswrapper[4749]: I0320 07:31:30.432269 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3723f09-8ae3-44e2-b5c7-7824e62755f7-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a3723f09-8ae3-44e2-b5c7-7824e62755f7\") " 
pod="openstack/ovsdbserver-nb-0" Mar 20 07:31:30 crc kubenswrapper[4749]: I0320 07:31:30.432360 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a3723f09-8ae3-44e2-b5c7-7824e62755f7-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a3723f09-8ae3-44e2-b5c7-7824e62755f7\") " pod="openstack/ovsdbserver-nb-0" Mar 20 07:31:30 crc kubenswrapper[4749]: I0320 07:31:30.432391 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3723f09-8ae3-44e2-b5c7-7824e62755f7-config\") pod \"ovsdbserver-nb-0\" (UID: \"a3723f09-8ae3-44e2-b5c7-7824e62755f7\") " pod="openstack/ovsdbserver-nb-0" Mar 20 07:31:30 crc kubenswrapper[4749]: I0320 07:31:30.432456 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3723f09-8ae3-44e2-b5c7-7824e62755f7-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a3723f09-8ae3-44e2-b5c7-7824e62755f7\") " pod="openstack/ovsdbserver-nb-0" Mar 20 07:31:30 crc kubenswrapper[4749]: I0320 07:31:30.432511 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qbrx\" (UniqueName: \"kubernetes.io/projected/a3723f09-8ae3-44e2-b5c7-7824e62755f7-kube-api-access-6qbrx\") pod \"ovsdbserver-nb-0\" (UID: \"a3723f09-8ae3-44e2-b5c7-7824e62755f7\") " pod="openstack/ovsdbserver-nb-0" Mar 20 07:31:30 crc kubenswrapper[4749]: I0320 07:31:30.433411 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3723f09-8ae3-44e2-b5c7-7824e62755f7-config\") pod \"ovsdbserver-nb-0\" (UID: \"a3723f09-8ae3-44e2-b5c7-7824e62755f7\") " pod="openstack/ovsdbserver-nb-0" Mar 20 07:31:30 crc kubenswrapper[4749]: I0320 07:31:30.433660 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a3723f09-8ae3-44e2-b5c7-7824e62755f7\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-nb-0" Mar 20 07:31:30 crc kubenswrapper[4749]: I0320 07:31:30.433783 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a3723f09-8ae3-44e2-b5c7-7824e62755f7-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a3723f09-8ae3-44e2-b5c7-7824e62755f7\") " pod="openstack/ovsdbserver-nb-0" Mar 20 07:31:30 crc kubenswrapper[4749]: I0320 07:31:30.435016 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3723f09-8ae3-44e2-b5c7-7824e62755f7-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a3723f09-8ae3-44e2-b5c7-7824e62755f7\") " pod="openstack/ovsdbserver-nb-0" Mar 20 07:31:30 crc kubenswrapper[4749]: I0320 07:31:30.440356 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3723f09-8ae3-44e2-b5c7-7824e62755f7-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a3723f09-8ae3-44e2-b5c7-7824e62755f7\") " pod="openstack/ovsdbserver-nb-0" Mar 20 07:31:30 crc kubenswrapper[4749]: I0320 07:31:30.440407 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3723f09-8ae3-44e2-b5c7-7824e62755f7-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: 
\"a3723f09-8ae3-44e2-b5c7-7824e62755f7\") " pod="openstack/ovsdbserver-nb-0" Mar 20 07:31:30 crc kubenswrapper[4749]: I0320 07:31:30.449204 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3723f09-8ae3-44e2-b5c7-7824e62755f7-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a3723f09-8ae3-44e2-b5c7-7824e62755f7\") " pod="openstack/ovsdbserver-nb-0" Mar 20 07:31:30 crc kubenswrapper[4749]: I0320 07:31:30.466268 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qbrx\" (UniqueName: \"kubernetes.io/projected/a3723f09-8ae3-44e2-b5c7-7824e62755f7-kube-api-access-6qbrx\") pod \"ovsdbserver-nb-0\" (UID: \"a3723f09-8ae3-44e2-b5c7-7824e62755f7\") " pod="openstack/ovsdbserver-nb-0" Mar 20 07:31:30 crc kubenswrapper[4749]: I0320 07:31:30.481357 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a3723f09-8ae3-44e2-b5c7-7824e62755f7\") " pod="openstack/ovsdbserver-nb-0" Mar 20 07:31:30 crc kubenswrapper[4749]: I0320 07:31:30.538181 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 20 07:31:32 crc kubenswrapper[4749]: I0320 07:31:32.724542 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 07:31:32 crc kubenswrapper[4749]: I0320 07:31:32.725862 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 20 07:31:32 crc kubenswrapper[4749]: I0320 07:31:32.732413 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-s7v8c" Mar 20 07:31:32 crc kubenswrapper[4749]: I0320 07:31:32.732605 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 20 07:31:32 crc kubenswrapper[4749]: I0320 07:31:32.732756 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 20 07:31:32 crc kubenswrapper[4749]: I0320 07:31:32.732905 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 20 07:31:32 crc kubenswrapper[4749]: I0320 07:31:32.741354 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 07:31:32 crc kubenswrapper[4749]: I0320 07:31:32.766790 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/541ae4e0-d6e7-4ad9-8451-8f5b840050de-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"541ae4e0-d6e7-4ad9-8451-8f5b840050de\") " pod="openstack/ovsdbserver-sb-0" Mar 20 07:31:32 crc kubenswrapper[4749]: I0320 07:31:32.766843 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/541ae4e0-d6e7-4ad9-8451-8f5b840050de-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"541ae4e0-d6e7-4ad9-8451-8f5b840050de\") " pod="openstack/ovsdbserver-sb-0" Mar 20 07:31:32 crc kubenswrapper[4749]: I0320 07:31:32.766875 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod 
\"ovsdbserver-sb-0\" (UID: \"541ae4e0-d6e7-4ad9-8451-8f5b840050de\") " pod="openstack/ovsdbserver-sb-0" Mar 20 07:31:32 crc kubenswrapper[4749]: I0320 07:31:32.766912 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/541ae4e0-d6e7-4ad9-8451-8f5b840050de-config\") pod \"ovsdbserver-sb-0\" (UID: \"541ae4e0-d6e7-4ad9-8451-8f5b840050de\") " pod="openstack/ovsdbserver-sb-0" Mar 20 07:31:32 crc kubenswrapper[4749]: I0320 07:31:32.767017 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/541ae4e0-d6e7-4ad9-8451-8f5b840050de-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"541ae4e0-d6e7-4ad9-8451-8f5b840050de\") " pod="openstack/ovsdbserver-sb-0" Mar 20 07:31:32 crc kubenswrapper[4749]: I0320 07:31:32.767158 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/541ae4e0-d6e7-4ad9-8451-8f5b840050de-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"541ae4e0-d6e7-4ad9-8451-8f5b840050de\") " pod="openstack/ovsdbserver-sb-0" Mar 20 07:31:32 crc kubenswrapper[4749]: I0320 07:31:32.767408 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/541ae4e0-d6e7-4ad9-8451-8f5b840050de-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"541ae4e0-d6e7-4ad9-8451-8f5b840050de\") " pod="openstack/ovsdbserver-sb-0" Mar 20 07:31:32 crc kubenswrapper[4749]: I0320 07:31:32.767515 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsp82\" (UniqueName: \"kubernetes.io/projected/541ae4e0-d6e7-4ad9-8451-8f5b840050de-kube-api-access-hsp82\") pod \"ovsdbserver-sb-0\" (UID: \"541ae4e0-d6e7-4ad9-8451-8f5b840050de\") " pod="openstack/ovsdbserver-sb-0" Mar 20 07:31:32 crc kubenswrapper[4749]: I0320 07:31:32.869036 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"541ae4e0-d6e7-4ad9-8451-8f5b840050de\") " pod="openstack/ovsdbserver-sb-0" Mar 20 07:31:32 crc kubenswrapper[4749]: I0320 07:31:32.869090 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/541ae4e0-d6e7-4ad9-8451-8f5b840050de-config\") pod \"ovsdbserver-sb-0\" (UID: \"541ae4e0-d6e7-4ad9-8451-8f5b840050de\") " pod="openstack/ovsdbserver-sb-0" Mar 20 07:31:32 crc kubenswrapper[4749]: I0320 07:31:32.869129 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/541ae4e0-d6e7-4ad9-8451-8f5b840050de-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"541ae4e0-d6e7-4ad9-8451-8f5b840050de\") " pod="openstack/ovsdbserver-sb-0" Mar 20 07:31:32 crc kubenswrapper[4749]: I0320 07:31:32.869156 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/541ae4e0-d6e7-4ad9-8451-8f5b840050de-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"541ae4e0-d6e7-4ad9-8451-8f5b840050de\") " pod="openstack/ovsdbserver-sb-0" Mar 20 07:31:32 crc kubenswrapper[4749]: I0320 07:31:32.869227 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/541ae4e0-d6e7-4ad9-8451-8f5b840050de-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"541ae4e0-d6e7-4ad9-8451-8f5b840050de\") " pod="openstack/ovsdbserver-sb-0" Mar 20 07:31:32 crc kubenswrapper[4749]: I0320 07:31:32.869260 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsp82\" (UniqueName: \"kubernetes.io/projected/541ae4e0-d6e7-4ad9-8451-8f5b840050de-kube-api-access-hsp82\") pod \"ovsdbserver-sb-0\" (UID: \"541ae4e0-d6e7-4ad9-8451-8f5b840050de\") " pod="openstack/ovsdbserver-sb-0" Mar 20 07:31:32 crc kubenswrapper[4749]: I0320 07:31:32.869321 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/541ae4e0-d6e7-4ad9-8451-8f5b840050de-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"541ae4e0-d6e7-4ad9-8451-8f5b840050de\") " pod="openstack/ovsdbserver-sb-0" Mar 20 07:31:32 crc kubenswrapper[4749]: I0320 07:31:32.870054 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/541ae4e0-d6e7-4ad9-8451-8f5b840050de-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"541ae4e0-d6e7-4ad9-8451-8f5b840050de\") " pod="openstack/ovsdbserver-sb-0" Mar 20 07:31:32 crc kubenswrapper[4749]: I0320 07:31:32.870273 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"541ae4e0-d6e7-4ad9-8451-8f5b840050de\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/ovsdbserver-sb-0" Mar 20 07:31:32 crc kubenswrapper[4749]: I0320 07:31:32.870662 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/541ae4e0-d6e7-4ad9-8451-8f5b840050de-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"541ae4e0-d6e7-4ad9-8451-8f5b840050de\") " pod="openstack/ovsdbserver-sb-0" Mar 20 07:31:32 crc kubenswrapper[4749]: I0320 07:31:32.871073 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/541ae4e0-d6e7-4ad9-8451-8f5b840050de-config\") pod \"ovsdbserver-sb-0\" (UID: \"541ae4e0-d6e7-4ad9-8451-8f5b840050de\") " pod="openstack/ovsdbserver-sb-0" Mar 20 07:31:32 crc kubenswrapper[4749]: I0320 07:31:32.874703 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/541ae4e0-d6e7-4ad9-8451-8f5b840050de-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"541ae4e0-d6e7-4ad9-8451-8f5b840050de\") " pod="openstack/ovsdbserver-sb-0" Mar 20 07:31:32 crc kubenswrapper[4749]: I0320 07:31:32.879677 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/541ae4e0-d6e7-4ad9-8451-8f5b840050de-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"541ae4e0-d6e7-4ad9-8451-8f5b840050de\") " pod="openstack/ovsdbserver-sb-0" Mar 20 07:31:32 crc kubenswrapper[4749]: I0320 07:31:32.879952 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/541ae4e0-d6e7-4ad9-8451-8f5b840050de-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"541ae4e0-d6e7-4ad9-8451-8f5b840050de\") " 
pod="openstack/ovsdbserver-sb-0" Mar 20 07:31:32 crc kubenswrapper[4749]: I0320 07:31:32.887622 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsp82\" (UniqueName: \"kubernetes.io/projected/541ae4e0-d6e7-4ad9-8451-8f5b840050de-kube-api-access-hsp82\") pod \"ovsdbserver-sb-0\" (UID: \"541ae4e0-d6e7-4ad9-8451-8f5b840050de\") " pod="openstack/ovsdbserver-sb-0" Mar 20 07:31:32 crc kubenswrapper[4749]: I0320 07:31:32.890825 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/541ae4e0-d6e7-4ad9-8451-8f5b840050de-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"541ae4e0-d6e7-4ad9-8451-8f5b840050de\") " pod="openstack/ovsdbserver-sb-0" Mar 20 07:31:32 crc kubenswrapper[4749]: I0320 07:31:32.903370 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"541ae4e0-d6e7-4ad9-8451-8f5b840050de\") " pod="openstack/ovsdbserver-sb-0" Mar 20 07:31:33 crc kubenswrapper[4749]: I0320 07:31:33.064852 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 20 07:31:34 crc kubenswrapper[4749]: I0320 07:31:34.515136 4749 patch_prober.go:28] interesting pod/machine-config-daemon-fxqfd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 07:31:34 crc kubenswrapper[4749]: I0320 07:31:34.515211 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:31:35 crc kubenswrapper[4749]: E0320 07:31:35.972418 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Mar 20 07:31:35 crc kubenswrapper[4749]: E0320 07:31:35.973136 4749 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 07:31:35 crc kubenswrapper[4749]: init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c],Args:[set -e Mar 20 07:31:35 crc kubenswrapper[4749]: cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie Mar 20 07:31:35 crc kubenswrapper[4749]: chmod 600 /var/lib/rabbitmq/.erlang.cookie Mar 20 07:31:35 crc kubenswrapper[4749]: cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins Mar 20 07:31:35 crc kubenswrapper[4749]: echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf Mar 20 07:31:35 crc kubenswrapper[4749]: sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf Mar 20 07:31:35 crc kubenswrapper[4749]: chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf Mar 20 07:31:35 crc kubenswrapper[4749]: # Allow time for multi-pod clusters to complete peer discovery Mar 20 07:31:35 crc kubenswrapper[4749]: sleep 30],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 
20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9rjx7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled Mar 20 07:31:35 crc kubenswrapper[4749]: > logger="UnhandledError" Mar 20 07:31:35 crc kubenswrapper[4749]: E0320 07:31:35.974341 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:31:36 crc kubenswrapper[4749]: E0320 07:31:36.532820 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:31:36 crc kubenswrapper[4749]: E0320 07:31:36.905816 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 07:31:36 crc kubenswrapper[4749]: E0320 07:31:36.905967 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces 
--listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rcfhj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-l84vp_openstack(1a4fd6b7-b0e1-4f80-ae5e-4c6933f87f9b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 07:31:36 crc kubenswrapper[4749]: E0320 07:31:36.907150 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-l84vp" podUID="1a4fd6b7-b0e1-4f80-ae5e-4c6933f87f9b" Mar 20 07:31:36 crc kubenswrapper[4749]: E0320 07:31:36.934803 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 07:31:36 crc kubenswrapper[4749]: E0320 07:31:36.934989 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c5qml,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-cv4rz_openstack(87d010cf-d0a5-4c30-b14b-6f81e05e6ec4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 07:31:36 crc kubenswrapper[4749]: E0320 07:31:36.936723 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-cv4rz" podUID="87d010cf-d0a5-4c30-b14b-6f81e05e6ec4" Mar 20 07:31:36 crc kubenswrapper[4749]: E0320 07:31:36.942157 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 07:31:36 crc kubenswrapper[4749]: E0320 07:31:36.942409 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4pbpf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-6mxcw_openstack(5792dc4b-6da9-4a1b-9450-62d6748d79cf): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 07:31:36 crc kubenswrapper[4749]: E0320 07:31:36.943613 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-6mxcw" podUID="5792dc4b-6da9-4a1b-9450-62d6748d79cf" Mar 20 07:31:36 crc kubenswrapper[4749]: E0320 07:31:36.992750 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 07:31:36 crc kubenswrapper[4749]: E0320 07:31:36.992939 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jb77z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-qj5tr_openstack(978ab210-2b6f-468a-b2a6-4c1c80b5932e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 07:31:36 crc kubenswrapper[4749]: E0320 07:31:36.994343 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-qj5tr" podUID="978ab210-2b6f-468a-b2a6-4c1c80b5932e" Mar 20 07:31:37 crc kubenswrapper[4749]: I0320 07:31:37.414778 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 07:31:37 crc kubenswrapper[4749]: W0320 07:31:37.507902 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8fd1b2c_620b_44a6_b4f0_1c4d2cbda056.slice/crio-32b24a697b53a4c39cf4d3099b9018f3fea7dbd3b473974720787f2e1438d143 WatchSource:0}: Error finding container 32b24a697b53a4c39cf4d3099b9018f3fea7dbd3b473974720787f2e1438d143: Status 404 returned error can't find the container with id 32b24a697b53a4c39cf4d3099b9018f3fea7dbd3b473974720787f2e1438d143 Mar 20 07:31:37 crc kubenswrapper[4749]: I0320 07:31:37.507954 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 20 07:31:37 crc kubenswrapper[4749]: W0320 07:31:37.511951 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c96afef_fa85_45f2_89cd_2fb2db26b9f8.slice/crio-c81fa474eb0440455a42452a23952697ae44c54fe0b8ef87bb8328f69a51c491 WatchSource:0}: Error finding container c81fa474eb0440455a42452a23952697ae44c54fe0b8ef87bb8328f69a51c491: Status 404 returned error can't find the 
container with id c81fa474eb0440455a42452a23952697ae44c54fe0b8ef87bb8328f69a51c491 Mar 20 07:31:37 crc kubenswrapper[4749]: I0320 07:31:37.520908 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 20 07:31:37 crc kubenswrapper[4749]: I0320 07:31:37.533655 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"1c96afef-fa85-45f2-89cd-2fb2db26b9f8","Type":"ContainerStarted","Data":"c81fa474eb0440455a42452a23952697ae44c54fe0b8ef87bb8328f69a51c491"} Mar 20 07:31:37 crc kubenswrapper[4749]: I0320 07:31:37.534785 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"e8fd1b2c-620b-44a6-b4f0-1c4d2cbda056","Type":"ContainerStarted","Data":"32b24a697b53a4c39cf4d3099b9018f3fea7dbd3b473974720787f2e1438d143"} Mar 20 07:31:37 crc kubenswrapper[4749]: I0320 07:31:37.536156 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8052bc33-6f6a-437e-9df5-508256f7e32f","Type":"ContainerStarted","Data":"f26399b1fc3caef51022fda97aac7ef17b6c92256f1f441ac3003e509b370a90"} Mar 20 07:31:37 crc kubenswrapper[4749]: E0320 07:31:37.538160 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-qj5tr" podUID="978ab210-2b6f-468a-b2a6-4c1c80b5932e" Mar 20 07:31:37 crc kubenswrapper[4749]: E0320 07:31:37.538363 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-6mxcw" podUID="5792dc4b-6da9-4a1b-9450-62d6748d79cf" Mar 20 07:31:37 crc kubenswrapper[4749]: I0320 07:31:37.638699 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-tx9bw"] Mar 20 07:31:37 crc kubenswrapper[4749]: I0320 07:31:37.648171 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 07:31:37 crc kubenswrapper[4749]: W0320 07:31:37.668875 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32ceaa95_18d9_4f1e_9ebd_f2d413709413.slice/crio-5e567de11cf63dc216346e7a3230e887b310244cc700b85e5596c8025f181ca2 WatchSource:0}: Error finding container 5e567de11cf63dc216346e7a3230e887b310244cc700b85e5596c8025f181ca2: Status 404 returned error can't find the container with id 5e567de11cf63dc216346e7a3230e887b310244cc700b85e5596c8025f181ca2 Mar 20 07:31:37 crc kubenswrapper[4749]: I0320 07:31:37.718781 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 07:31:37 crc kubenswrapper[4749]: W0320 07:31:37.729932 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3723f09_8ae3_44e2_b5c7_7824e62755f7.slice/crio-879f6e005eb6b9f28733ef346932dfc1908140ffe29b41e20c7e2ba36dad8d02 WatchSource:0}: Error finding container 879f6e005eb6b9f28733ef346932dfc1908140ffe29b41e20c7e2ba36dad8d02: Status 404 returned error can't find the container with id 879f6e005eb6b9f28733ef346932dfc1908140ffe29b41e20c7e2ba36dad8d02 Mar 20 07:31:37 crc kubenswrapper[4749]: I0320 07:31:37.846565 4749 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-kvqdd"] Mar 20 07:31:38 crc kubenswrapper[4749]: I0320 07:31:38.090949 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-l84vp" Mar 20 07:31:38 crc kubenswrapper[4749]: I0320 07:31:38.098046 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-cv4rz" Mar 20 07:31:38 crc kubenswrapper[4749]: I0320 07:31:38.267151 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a4fd6b7-b0e1-4f80-ae5e-4c6933f87f9b-dns-svc\") pod \"1a4fd6b7-b0e1-4f80-ae5e-4c6933f87f9b\" (UID: \"1a4fd6b7-b0e1-4f80-ae5e-4c6933f87f9b\") " Mar 20 07:31:38 crc kubenswrapper[4749]: I0320 07:31:38.267411 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a4fd6b7-b0e1-4f80-ae5e-4c6933f87f9b-config\") pod \"1a4fd6b7-b0e1-4f80-ae5e-4c6933f87f9b\" (UID: \"1a4fd6b7-b0e1-4f80-ae5e-4c6933f87f9b\") " Mar 20 07:31:38 crc kubenswrapper[4749]: I0320 07:31:38.267475 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcfhj\" (UniqueName: \"kubernetes.io/projected/1a4fd6b7-b0e1-4f80-ae5e-4c6933f87f9b-kube-api-access-rcfhj\") pod \"1a4fd6b7-b0e1-4f80-ae5e-4c6933f87f9b\" (UID: \"1a4fd6b7-b0e1-4f80-ae5e-4c6933f87f9b\") " Mar 20 07:31:38 crc kubenswrapper[4749]: I0320 07:31:38.267519 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5qml\" (UniqueName: \"kubernetes.io/projected/87d010cf-d0a5-4c30-b14b-6f81e05e6ec4-kube-api-access-c5qml\") pod \"87d010cf-d0a5-4c30-b14b-6f81e05e6ec4\" (UID: \"87d010cf-d0a5-4c30-b14b-6f81e05e6ec4\") " Mar 20 07:31:38 crc kubenswrapper[4749]: I0320 07:31:38.267604 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87d010cf-d0a5-4c30-b14b-6f81e05e6ec4-config\") pod \"87d010cf-d0a5-4c30-b14b-6f81e05e6ec4\" (UID: \"87d010cf-d0a5-4c30-b14b-6f81e05e6ec4\") " Mar 20 07:31:38 crc kubenswrapper[4749]: I0320 07:31:38.268260 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a4fd6b7-b0e1-4f80-ae5e-4c6933f87f9b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1a4fd6b7-b0e1-4f80-ae5e-4c6933f87f9b" (UID: "1a4fd6b7-b0e1-4f80-ae5e-4c6933f87f9b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:31:38 crc kubenswrapper[4749]: I0320 07:31:38.268873 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a4fd6b7-b0e1-4f80-ae5e-4c6933f87f9b-config" (OuterVolumeSpecName: "config") pod "1a4fd6b7-b0e1-4f80-ae5e-4c6933f87f9b" (UID: "1a4fd6b7-b0e1-4f80-ae5e-4c6933f87f9b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:31:38 crc kubenswrapper[4749]: I0320 07:31:38.269607 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a4fd6b7-b0e1-4f80-ae5e-4c6933f87f9b-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:31:38 crc kubenswrapper[4749]: I0320 07:31:38.269663 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a4fd6b7-b0e1-4f80-ae5e-4c6933f87f9b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 07:31:38 crc kubenswrapper[4749]: I0320 07:31:38.269797 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87d010cf-d0a5-4c30-b14b-6f81e05e6ec4-config" (OuterVolumeSpecName: "config") pod "87d010cf-d0a5-4c30-b14b-6f81e05e6ec4" (UID: "87d010cf-d0a5-4c30-b14b-6f81e05e6ec4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:31:38 crc kubenswrapper[4749]: I0320 07:31:38.274492 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87d010cf-d0a5-4c30-b14b-6f81e05e6ec4-kube-api-access-c5qml" (OuterVolumeSpecName: "kube-api-access-c5qml") pod "87d010cf-d0a5-4c30-b14b-6f81e05e6ec4" (UID: "87d010cf-d0a5-4c30-b14b-6f81e05e6ec4"). InnerVolumeSpecName "kube-api-access-c5qml". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:31:38 crc kubenswrapper[4749]: I0320 07:31:38.274875 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a4fd6b7-b0e1-4f80-ae5e-4c6933f87f9b-kube-api-access-rcfhj" (OuterVolumeSpecName: "kube-api-access-rcfhj") pod "1a4fd6b7-b0e1-4f80-ae5e-4c6933f87f9b" (UID: "1a4fd6b7-b0e1-4f80-ae5e-4c6933f87f9b"). InnerVolumeSpecName "kube-api-access-rcfhj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:31:38 crc kubenswrapper[4749]: I0320 07:31:38.371903 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcfhj\" (UniqueName: \"kubernetes.io/projected/1a4fd6b7-b0e1-4f80-ae5e-4c6933f87f9b-kube-api-access-rcfhj\") on node \"crc\" DevicePath \"\"" Mar 20 07:31:38 crc kubenswrapper[4749]: I0320 07:31:38.372133 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5qml\" (UniqueName: \"kubernetes.io/projected/87d010cf-d0a5-4c30-b14b-6f81e05e6ec4-kube-api-access-c5qml\") on node \"crc\" DevicePath \"\"" Mar 20 07:31:38 crc kubenswrapper[4749]: I0320 07:31:38.372165 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87d010cf-d0a5-4c30-b14b-6f81e05e6ec4-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:31:38 crc kubenswrapper[4749]: I0320 07:31:38.549948 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-tx9bw" event={"ID":"32ceaa95-18d9-4f1e-9ebd-f2d413709413","Type":"ContainerStarted","Data":"5e567de11cf63dc216346e7a3230e887b310244cc700b85e5596c8025f181ca2"} Mar 20 07:31:38 crc kubenswrapper[4749]: I0320 07:31:38.552471 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-cv4rz" Mar 20 07:31:38 crc kubenswrapper[4749]: I0320 07:31:38.552476 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-cv4rz" event={"ID":"87d010cf-d0a5-4c30-b14b-6f81e05e6ec4","Type":"ContainerDied","Data":"d0e7884e9edb6ab98934d1d33e5918fa3498ea1558a9b77ff9a58e2bd943a8f0"} Mar 20 07:31:38 crc kubenswrapper[4749]: I0320 07:31:38.559338 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-l84vp" event={"ID":"1a4fd6b7-b0e1-4f80-ae5e-4c6933f87f9b","Type":"ContainerDied","Data":"189abc5a481a291d468b34caf0f5c4c24fdb135a54f7660c83cbddd342c0475a"} Mar 20 07:31:38 crc kubenswrapper[4749]: I0320 07:31:38.559473 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-l84vp" Mar 20 07:31:38 crc kubenswrapper[4749]: I0320 07:31:38.562416 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b364d204-dcba-4b43-98e1-f1e22bd89b2c","Type":"ContainerStarted","Data":"877edf12b787b953e70c38f88e00f7b5d95df6633ce265c592de936f5c39ecc9"} Mar 20 07:31:38 crc kubenswrapper[4749]: I0320 07:31:38.563787 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"a3723f09-8ae3-44e2-b5c7-7824e62755f7","Type":"ContainerStarted","Data":"879f6e005eb6b9f28733ef346932dfc1908140ffe29b41e20c7e2ba36dad8d02"} Mar 20 07:31:38 crc kubenswrapper[4749]: I0320 07:31:38.566234 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-kvqdd" event={"ID":"d72e69d0-23f3-4d14-ab35-74ea19e79b69","Type":"ContainerStarted","Data":"a9b8f5e8f5570351fc60f44a07ef2bcf62d5c1007d06d4b2263b71f0a89bfd89"} Mar 20 07:31:38 crc kubenswrapper[4749]: I0320 07:31:38.611233 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-cv4rz"] Mar 20 07:31:38 crc kubenswrapper[4749]: I0320 07:31:38.626407 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-cv4rz"] Mar 20 07:31:38 crc kubenswrapper[4749]: I0320 07:31:38.643135 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-l84vp"] Mar 20 07:31:38 crc kubenswrapper[4749]: I0320 07:31:38.650014 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-l84vp"] Mar 20 07:31:38 crc kubenswrapper[4749]: I0320 07:31:38.708960 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 07:31:39 crc kubenswrapper[4749]: I0320 07:31:39.579798 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"541ae4e0-d6e7-4ad9-8451-8f5b840050de","Type":"ContainerStarted","Data":"6598d779c2f5dcdbe552901444e5b134e799063923991932bf44bd715e5af80c"} Mar 20 07:31:40 crc kubenswrapper[4749]: I0320 07:31:40.196007 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a4fd6b7-b0e1-4f80-ae5e-4c6933f87f9b" path="/var/lib/kubelet/pods/1a4fd6b7-b0e1-4f80-ae5e-4c6933f87f9b/volumes" Mar 20 07:31:40 crc kubenswrapper[4749]: I0320 07:31:40.197073 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87d010cf-d0a5-4c30-b14b-6f81e05e6ec4" path="/var/lib/kubelet/pods/87d010cf-d0a5-4c30-b14b-6f81e05e6ec4/volumes" Mar 20 07:31:40 crc kubenswrapper[4749]: I0320 07:31:40.593935 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8b9b402f-2d95-48f5-98d8-497d90956ba2","Type":"ContainerStarted","Data":"8572c6a9460b80002b347994673a59cd6df57ba39c3cf1dc1f924436191cb2c3"} Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.100835 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-7xcsl"] Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.104627 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-7xcsl" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.108921 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.111300 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-7xcsl"] Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.220990 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-qj5tr"] Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.253934 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/fe444947-6938-4000-8de5-462c8d0a42aa-ovs-rundir\") pod \"ovn-controller-metrics-7xcsl\" (UID: \"fe444947-6938-4000-8de5-462c8d0a42aa\") " pod="openstack/ovn-controller-metrics-7xcsl" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.254000 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lldq4\" (UniqueName: \"kubernetes.io/projected/fe444947-6938-4000-8de5-462c8d0a42aa-kube-api-access-lldq4\") pod \"ovn-controller-metrics-7xcsl\" (UID: \"fe444947-6938-4000-8de5-462c8d0a42aa\") " pod="openstack/ovn-controller-metrics-7xcsl" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.254035 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe444947-6938-4000-8de5-462c8d0a42aa-config\") pod \"ovn-controller-metrics-7xcsl\" (UID: \"fe444947-6938-4000-8de5-462c8d0a42aa\") " pod="openstack/ovn-controller-metrics-7xcsl" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.254061 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/fe444947-6938-4000-8de5-462c8d0a42aa-ovn-rundir\") pod \"ovn-controller-metrics-7xcsl\" (UID: \"fe444947-6938-4000-8de5-462c8d0a42aa\") " pod="openstack/ovn-controller-metrics-7xcsl" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.254101 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe444947-6938-4000-8de5-462c8d0a42aa-combined-ca-bundle\") pod \"ovn-controller-metrics-7xcsl\" (UID: \"fe444947-6938-4000-8de5-462c8d0a42aa\") " pod="openstack/ovn-controller-metrics-7xcsl" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.254178 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe444947-6938-4000-8de5-462c8d0a42aa-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-7xcsl\" (UID: \"fe444947-6938-4000-8de5-462c8d0a42aa\") " pod="openstack/ovn-controller-metrics-7xcsl" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.255386 
4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-bs7bl"] Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.257782 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-bs7bl" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.262057 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.282510 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-bs7bl"] Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.356177 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfqcl\" (UniqueName: \"kubernetes.io/projected/3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4-kube-api-access-zfqcl\") pod \"dnsmasq-dns-7fd796d7df-bs7bl\" (UID: \"3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4\") " pod="openstack/dnsmasq-dns-7fd796d7df-bs7bl" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.356250 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe444947-6938-4000-8de5-462c8d0a42aa-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-7xcsl\" (UID: \"fe444947-6938-4000-8de5-462c8d0a42aa\") " pod="openstack/ovn-controller-metrics-7xcsl" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.356298 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/fe444947-6938-4000-8de5-462c8d0a42aa-ovs-rundir\") pod \"ovn-controller-metrics-7xcsl\" (UID: \"fe444947-6938-4000-8de5-462c8d0a42aa\") " pod="openstack/ovn-controller-metrics-7xcsl" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.356327 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lldq4\" (UniqueName: \"kubernetes.io/projected/fe444947-6938-4000-8de5-462c8d0a42aa-kube-api-access-lldq4\") pod \"ovn-controller-metrics-7xcsl\" (UID: \"fe444947-6938-4000-8de5-462c8d0a42aa\") " pod="openstack/ovn-controller-metrics-7xcsl" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.356343 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-bs7bl\" (UID: \"3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4\") " pod="openstack/dnsmasq-dns-7fd796d7df-bs7bl" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.357066 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe444947-6938-4000-8de5-462c8d0a42aa-config\") pod \"ovn-controller-metrics-7xcsl\" (UID: \"fe444947-6938-4000-8de5-462c8d0a42aa\") " pod="openstack/ovn-controller-metrics-7xcsl" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.357096 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/fe444947-6938-4000-8de5-462c8d0a42aa-ovn-rundir\") pod \"ovn-controller-metrics-7xcsl\" (UID: \"fe444947-6938-4000-8de5-462c8d0a42aa\") " pod="openstack/ovn-controller-metrics-7xcsl" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.357124 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-bs7bl\" (UID: \"3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4\") " pod="openstack/dnsmasq-dns-7fd796d7df-bs7bl" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.357148 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4-config\") pod \"dnsmasq-dns-7fd796d7df-bs7bl\" (UID: \"3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4\") " pod="openstack/dnsmasq-dns-7fd796d7df-bs7bl" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.357167 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe444947-6938-4000-8de5-462c8d0a42aa-combined-ca-bundle\") pod \"ovn-controller-metrics-7xcsl\" (UID: \"fe444947-6938-4000-8de5-462c8d0a42aa\") " pod="openstack/ovn-controller-metrics-7xcsl" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.359417 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/fe444947-6938-4000-8de5-462c8d0a42aa-ovs-rundir\") pod \"ovn-controller-metrics-7xcsl\" (UID: \"fe444947-6938-4000-8de5-462c8d0a42aa\") " pod="openstack/ovn-controller-metrics-7xcsl" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.359428 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/fe444947-6938-4000-8de5-462c8d0a42aa-ovn-rundir\") pod \"ovn-controller-metrics-7xcsl\" (UID: \"fe444947-6938-4000-8de5-462c8d0a42aa\") " pod="openstack/ovn-controller-metrics-7xcsl" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.360579 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe444947-6938-4000-8de5-462c8d0a42aa-config\") pod \"ovn-controller-metrics-7xcsl\" (UID: \"fe444947-6938-4000-8de5-462c8d0a42aa\") " pod="openstack/ovn-controller-metrics-7xcsl" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.366694 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe444947-6938-4000-8de5-462c8d0a42aa-combined-ca-bundle\") pod \"ovn-controller-metrics-7xcsl\" (UID: \"fe444947-6938-4000-8de5-462c8d0a42aa\") " pod="openstack/ovn-controller-metrics-7xcsl" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.367104 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe444947-6938-4000-8de5-462c8d0a42aa-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-7xcsl\" (UID: \"fe444947-6938-4000-8de5-462c8d0a42aa\") " pod="openstack/ovn-controller-metrics-7xcsl" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.379827 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lldq4\" (UniqueName: \"kubernetes.io/projected/fe444947-6938-4000-8de5-462c8d0a42aa-kube-api-access-lldq4\") pod \"ovn-controller-metrics-7xcsl\" (UID: \"fe444947-6938-4000-8de5-462c8d0a42aa\") " pod="openstack/ovn-controller-metrics-7xcsl" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.431058 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-7xcsl" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.463465 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-bs7bl\" (UID: \"3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4\") " pod="openstack/dnsmasq-dns-7fd796d7df-bs7bl" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.463558 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-bs7bl\" (UID: \"3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4\") " pod="openstack/dnsmasq-dns-7fd796d7df-bs7bl" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.463594 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4-config\") pod \"dnsmasq-dns-7fd796d7df-bs7bl\" (UID: \"3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4\") " pod="openstack/dnsmasq-dns-7fd796d7df-bs7bl" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.463656 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfqcl\" (UniqueName: \"kubernetes.io/projected/3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4-kube-api-access-zfqcl\") pod \"dnsmasq-dns-7fd796d7df-bs7bl\" (UID: \"3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4\") " pod="openstack/dnsmasq-dns-7fd796d7df-bs7bl" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.468565 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-bs7bl\" (UID: \"3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4\") " pod="openstack/dnsmasq-dns-7fd796d7df-bs7bl" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.470043 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4-config\") pod \"dnsmasq-dns-7fd796d7df-bs7bl\" (UID: \"3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4\") " pod="openstack/dnsmasq-dns-7fd796d7df-bs7bl" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.470182 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-bs7bl\" (UID: \"3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4\") " pod="openstack/dnsmasq-dns-7fd796d7df-bs7bl" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.476751 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-6mxcw"] Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.496038 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfqcl\" (UniqueName: \"kubernetes.io/projected/3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4-kube-api-access-zfqcl\") pod \"dnsmasq-dns-7fd796d7df-bs7bl\" (UID: \"3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4\") " pod="openstack/dnsmasq-dns-7fd796d7df-bs7bl" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.519014 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-b628q"] Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.520921 4749 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-b628q" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.524421 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.557907 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-b628q"] Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.628737 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-bs7bl" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.667077 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dc41dd13-79db-4f94-a11b-bc0cf369bb76-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-b628q\" (UID: \"dc41dd13-79db-4f94-a11b-bc0cf369bb76\") " pod="openstack/dnsmasq-dns-86db49b7ff-b628q" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.667154 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc41dd13-79db-4f94-a11b-bc0cf369bb76-config\") pod \"dnsmasq-dns-86db49b7ff-b628q\" (UID: \"dc41dd13-79db-4f94-a11b-bc0cf369bb76\") " pod="openstack/dnsmasq-dns-86db49b7ff-b628q" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.667197 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc41dd13-79db-4f94-a11b-bc0cf369bb76-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-b628q\" (UID: \"dc41dd13-79db-4f94-a11b-bc0cf369bb76\") " pod="openstack/dnsmasq-dns-86db49b7ff-b628q" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.667268 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hm7tg\" (UniqueName: \"kubernetes.io/projected/dc41dd13-79db-4f94-a11b-bc0cf369bb76-kube-api-access-hm7tg\") pod \"dnsmasq-dns-86db49b7ff-b628q\" (UID: \"dc41dd13-79db-4f94-a11b-bc0cf369bb76\") " pod="openstack/dnsmasq-dns-86db49b7ff-b628q" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.667391 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dc41dd13-79db-4f94-a11b-bc0cf369bb76-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-b628q\" (UID: \"dc41dd13-79db-4f94-a11b-bc0cf369bb76\") " pod="openstack/dnsmasq-dns-86db49b7ff-b628q" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.768557 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dc41dd13-79db-4f94-a11b-bc0cf369bb76-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-b628q\" (UID: \"dc41dd13-79db-4f94-a11b-bc0cf369bb76\") " pod="openstack/dnsmasq-dns-86db49b7ff-b628q" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.768625 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dc41dd13-79db-4f94-a11b-bc0cf369bb76-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-b628q\" (UID: \"dc41dd13-79db-4f94-a11b-bc0cf369bb76\") " pod="openstack/dnsmasq-dns-86db49b7ff-b628q" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.768680 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/dc41dd13-79db-4f94-a11b-bc0cf369bb76-config\") pod \"dnsmasq-dns-86db49b7ff-b628q\" (UID: \"dc41dd13-79db-4f94-a11b-bc0cf369bb76\") " pod="openstack/dnsmasq-dns-86db49b7ff-b628q" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.768721 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc41dd13-79db-4f94-a11b-bc0cf369bb76-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-b628q\" (UID: \"dc41dd13-79db-4f94-a11b-bc0cf369bb76\") " pod="openstack/dnsmasq-dns-86db49b7ff-b628q" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.768803 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hm7tg\" (UniqueName: \"kubernetes.io/projected/dc41dd13-79db-4f94-a11b-bc0cf369bb76-kube-api-access-hm7tg\") pod \"dnsmasq-dns-86db49b7ff-b628q\" (UID: \"dc41dd13-79db-4f94-a11b-bc0cf369bb76\") " pod="openstack/dnsmasq-dns-86db49b7ff-b628q" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.770059 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dc41dd13-79db-4f94-a11b-bc0cf369bb76-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-b628q\" (UID: \"dc41dd13-79db-4f94-a11b-bc0cf369bb76\") " pod="openstack/dnsmasq-dns-86db49b7ff-b628q" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.771101 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc41dd13-79db-4f94-a11b-bc0cf369bb76-config\") pod \"dnsmasq-dns-86db49b7ff-b628q\" (UID: \"dc41dd13-79db-4f94-a11b-bc0cf369bb76\") " pod="openstack/dnsmasq-dns-86db49b7ff-b628q" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.771585 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc41dd13-79db-4f94-a11b-bc0cf369bb76-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-b628q\" (UID: \"dc41dd13-79db-4f94-a11b-bc0cf369bb76\") " pod="openstack/dnsmasq-dns-86db49b7ff-b628q" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.773669 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dc41dd13-79db-4f94-a11b-bc0cf369bb76-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-b628q\" (UID: \"dc41dd13-79db-4f94-a11b-bc0cf369bb76\") " pod="openstack/dnsmasq-dns-86db49b7ff-b628q" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.793572 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hm7tg\" (UniqueName: \"kubernetes.io/projected/dc41dd13-79db-4f94-a11b-bc0cf369bb76-kube-api-access-hm7tg\") pod \"dnsmasq-dns-86db49b7ff-b628q\" (UID: \"dc41dd13-79db-4f94-a11b-bc0cf369bb76\") " pod="openstack/dnsmasq-dns-86db49b7ff-b628q" Mar 20 07:31:42 crc kubenswrapper[4749]: I0320 07:31:42.842029 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-b628q" Mar 20 07:31:43 crc kubenswrapper[4749]: I0320 07:31:43.276165 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-qj5tr"
Mar 20 07:31:43 crc kubenswrapper[4749]: I0320 07:31:43.382086 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/978ab210-2b6f-468a-b2a6-4c1c80b5932e-config\") pod \"978ab210-2b6f-468a-b2a6-4c1c80b5932e\" (UID: \"978ab210-2b6f-468a-b2a6-4c1c80b5932e\") "
Mar 20 07:31:43 crc kubenswrapper[4749]: I0320 07:31:43.382237 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/978ab210-2b6f-468a-b2a6-4c1c80b5932e-dns-svc\") pod \"978ab210-2b6f-468a-b2a6-4c1c80b5932e\" (UID: \"978ab210-2b6f-468a-b2a6-4c1c80b5932e\") "
Mar 20 07:31:43 crc kubenswrapper[4749]: I0320 07:31:43.382334 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jb77z\" (UniqueName: \"kubernetes.io/projected/978ab210-2b6f-468a-b2a6-4c1c80b5932e-kube-api-access-jb77z\") pod \"978ab210-2b6f-468a-b2a6-4c1c80b5932e\" (UID: \"978ab210-2b6f-468a-b2a6-4c1c80b5932e\") "
Mar 20 07:31:43 crc kubenswrapper[4749]: I0320 07:31:43.382634 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/978ab210-2b6f-468a-b2a6-4c1c80b5932e-config" (OuterVolumeSpecName: "config") pod "978ab210-2b6f-468a-b2a6-4c1c80b5932e" (UID: "978ab210-2b6f-468a-b2a6-4c1c80b5932e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 07:31:43 crc kubenswrapper[4749]: I0320 07:31:43.383100 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/978ab210-2b6f-468a-b2a6-4c1c80b5932e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "978ab210-2b6f-468a-b2a6-4c1c80b5932e" (UID: "978ab210-2b6f-468a-b2a6-4c1c80b5932e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 07:31:43 crc kubenswrapper[4749]: I0320 07:31:43.390125 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/978ab210-2b6f-468a-b2a6-4c1c80b5932e-kube-api-access-jb77z" (OuterVolumeSpecName: "kube-api-access-jb77z") pod "978ab210-2b6f-468a-b2a6-4c1c80b5932e" (UID: "978ab210-2b6f-468a-b2a6-4c1c80b5932e"). InnerVolumeSpecName "kube-api-access-jb77z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 07:31:43 crc kubenswrapper[4749]: I0320 07:31:43.419948 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-6mxcw"
Mar 20 07:31:43 crc kubenswrapper[4749]: I0320 07:31:43.484591 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/978ab210-2b6f-468a-b2a6-4c1c80b5932e-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 07:31:43 crc kubenswrapper[4749]: I0320 07:31:43.484623 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jb77z\" (UniqueName: \"kubernetes.io/projected/978ab210-2b6f-468a-b2a6-4c1c80b5932e-kube-api-access-jb77z\") on node \"crc\" DevicePath \"\""
Mar 20 07:31:43 crc kubenswrapper[4749]: I0320 07:31:43.484633 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/978ab210-2b6f-468a-b2a6-4c1c80b5932e-config\") on node \"crc\" DevicePath \"\""
Mar 20 07:31:43 crc kubenswrapper[4749]: I0320 07:31:43.585917 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5792dc4b-6da9-4a1b-9450-62d6748d79cf-dns-svc\") pod \"5792dc4b-6da9-4a1b-9450-62d6748d79cf\" (UID: \"5792dc4b-6da9-4a1b-9450-62d6748d79cf\") "
Mar 20 07:31:43 crc kubenswrapper[4749]: I0320 07:31:43.586091 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5792dc4b-6da9-4a1b-9450-62d6748d79cf-config\") pod \"5792dc4b-6da9-4a1b-9450-62d6748d79cf\" (UID: \"5792dc4b-6da9-4a1b-9450-62d6748d79cf\") "
Mar 20 07:31:43 crc kubenswrapper[4749]: I0320 07:31:43.586206 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pbpf\" (UniqueName: \"kubernetes.io/projected/5792dc4b-6da9-4a1b-9450-62d6748d79cf-kube-api-access-4pbpf\") pod \"5792dc4b-6da9-4a1b-9450-62d6748d79cf\" (UID: \"5792dc4b-6da9-4a1b-9450-62d6748d79cf\") "
Mar 20 07:31:43 crc kubenswrapper[4749]: I0320 07:31:43.586905 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5792dc4b-6da9-4a1b-9450-62d6748d79cf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5792dc4b-6da9-4a1b-9450-62d6748d79cf" (UID: "5792dc4b-6da9-4a1b-9450-62d6748d79cf"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 07:31:43 crc kubenswrapper[4749]: I0320 07:31:43.587435 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5792dc4b-6da9-4a1b-9450-62d6748d79cf-config" (OuterVolumeSpecName: "config") pod "5792dc4b-6da9-4a1b-9450-62d6748d79cf" (UID: "5792dc4b-6da9-4a1b-9450-62d6748d79cf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 07:31:43 crc kubenswrapper[4749]: I0320 07:31:43.595568 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5792dc4b-6da9-4a1b-9450-62d6748d79cf-kube-api-access-4pbpf" (OuterVolumeSpecName: "kube-api-access-4pbpf") pod "5792dc4b-6da9-4a1b-9450-62d6748d79cf" (UID: "5792dc4b-6da9-4a1b-9450-62d6748d79cf"). InnerVolumeSpecName "kube-api-access-4pbpf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 07:31:43 crc kubenswrapper[4749]: I0320 07:31:43.621821 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-6mxcw" event={"ID":"5792dc4b-6da9-4a1b-9450-62d6748d79cf","Type":"ContainerDied","Data":"bee93880d50a4be0f4b1014ab709c0f86ee4c46494066b94de9164045c5dc3a0"}
Mar 20 07:31:43 crc kubenswrapper[4749]: I0320 07:31:43.621910 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-6mxcw"
Mar 20 07:31:43 crc kubenswrapper[4749]: I0320 07:31:43.623914 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-qj5tr" event={"ID":"978ab210-2b6f-468a-b2a6-4c1c80b5932e","Type":"ContainerDied","Data":"e0551ea407906739757bf5dd242d29844549120d2ac53558f8f684a6214254f5"}
Mar 20 07:31:43 crc kubenswrapper[4749]: I0320 07:31:43.624249 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-qj5tr"
Mar 20 07:31:43 crc kubenswrapper[4749]: I0320 07:31:43.685855 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-6mxcw"]
Mar 20 07:31:43 crc kubenswrapper[4749]: I0320 07:31:43.690173 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5792dc4b-6da9-4a1b-9450-62d6748d79cf-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 07:31:43 crc kubenswrapper[4749]: I0320 07:31:43.690208 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5792dc4b-6da9-4a1b-9450-62d6748d79cf-config\") on node \"crc\" DevicePath \"\""
Mar 20 07:31:43 crc kubenswrapper[4749]: I0320 07:31:43.690220 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pbpf\" (UniqueName: \"kubernetes.io/projected/5792dc4b-6da9-4a1b-9450-62d6748d79cf-kube-api-access-4pbpf\") on node \"crc\" DevicePath \"\""
Mar 20 07:31:43 crc kubenswrapper[4749]: I0320 07:31:43.698217 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-6mxcw"]
Mar 20 07:31:43 crc kubenswrapper[4749]: I0320 07:31:43.711422 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-qj5tr"]
Mar 20 07:31:43 crc kubenswrapper[4749]: I0320 07:31:43.716081 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-qj5tr"]
Mar 20 07:31:44 crc kubenswrapper[4749]: I0320 07:31:44.219209 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5792dc4b-6da9-4a1b-9450-62d6748d79cf" path="/var/lib/kubelet/pods/5792dc4b-6da9-4a1b-9450-62d6748d79cf/volumes"
Mar 20 07:31:44 crc kubenswrapper[4749]: I0320 07:31:44.219871 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="978ab210-2b6f-468a-b2a6-4c1c80b5932e" path="/var/lib/kubelet/pods/978ab210-2b6f-468a-b2a6-4c1c80b5932e/volumes"
Mar 20 07:31:48 crc kubenswrapper[4749]: I0320 07:31:48.091797 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-b628q"]
Mar 20 07:31:48 crc kubenswrapper[4749]: W0320 07:31:48.095376 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc41dd13_79db_4f94_a11b_bc0cf369bb76.slice/crio-60834df33a440ec4f4e87dcab4614a7222f8d5230c9617b63bf99abc593dbf59 WatchSource:0}: Error finding container 60834df33a440ec4f4e87dcab4614a7222f8d5230c9617b63bf99abc593dbf59: Status 404 returned error can't find the container with id 60834df33a440ec4f4e87dcab4614a7222f8d5230c9617b63bf99abc593dbf59
Mar 20 07:31:48 crc kubenswrapper[4749]: I0320 07:31:48.144460 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-7xcsl"]
Mar 20 07:31:48 crc kubenswrapper[4749]: W0320 07:31:48.145740 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe444947_6938_4000_8de5_462c8d0a42aa.slice/crio-741be957f5bdab22abed5c4915e840ceb59bf42680aa2a81624fa326991f4fc7 WatchSource:0}: Error finding container 741be957f5bdab22abed5c4915e840ceb59bf42680aa2a81624fa326991f4fc7: Status 404 returned error can't find the container with id 741be957f5bdab22abed5c4915e840ceb59bf42680aa2a81624fa326991f4fc7
Mar 20 07:31:48 crc kubenswrapper[4749]: I0320 07:31:48.157328 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-bs7bl"]
Mar 20 07:31:48 crc kubenswrapper[4749]: W0320 07:31:48.163150 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d445ec2_8c9d_4312_9b9e_fcc997d5e0c4.slice/crio-b8842ddc36334176347958e60c83d16d5bd52f2d8d3a54e76a33bfbc8376230a WatchSource:0}: Error finding container b8842ddc36334176347958e60c83d16d5bd52f2d8d3a54e76a33bfbc8376230a: Status 404 returned error can't find the container with id b8842ddc36334176347958e60c83d16d5bd52f2d8d3a54e76a33bfbc8376230a
Mar 20 07:31:48 crc kubenswrapper[4749]: I0320 07:31:48.675539 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-bs7bl" event={"ID":"3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4","Type":"ContainerStarted","Data":"b8842ddc36334176347958e60c83d16d5bd52f2d8d3a54e76a33bfbc8376230a"}
Mar 20 07:31:48 crc kubenswrapper[4749]: I0320 07:31:48.677615 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"541ae4e0-d6e7-4ad9-8451-8f5b840050de","Type":"ContainerStarted","Data":"fdb88768b5f46b5cfdab2a2578effe8ee81e089bccfb435d38b60c14b096d119"}
Mar 20 07:31:48 crc kubenswrapper[4749]: I0320 07:31:48.678792 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-7xcsl" event={"ID":"fe444947-6938-4000-8de5-462c8d0a42aa","Type":"ContainerStarted","Data":"741be957f5bdab22abed5c4915e840ceb59bf42680aa2a81624fa326991f4fc7"}
Mar 20 07:31:48 crc kubenswrapper[4749]: I0320 07:31:48.692572 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-b628q" event={"ID":"dc41dd13-79db-4f94-a11b-bc0cf369bb76","Type":"ContainerStarted","Data":"60834df33a440ec4f4e87dcab4614a7222f8d5230c9617b63bf99abc593dbf59"}
Mar 20 07:31:48 crc kubenswrapper[4749]: I0320 07:31:48.697851 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-tx9bw" event={"ID":"32ceaa95-18d9-4f1e-9ebd-f2d413709413","Type":"ContainerStarted","Data":"55b0ee4fc434859307961407711a9225013211971849b172d0e422afb88c9390"}
Mar 20 07:31:48 crc kubenswrapper[4749]: I0320 07:31:48.698695 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-tx9bw"
Mar 20 07:31:48 crc kubenswrapper[4749]: I0320 07:31:48.701411 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8052bc33-6f6a-437e-9df5-508256f7e32f","Type":"ContainerStarted","Data":"11d2ce764eb7b4c4c511f3a78c63b6db96646d56a0169fa379c8ef66f658c7cb"}
Mar 20 07:31:48 crc kubenswrapper[4749]: I0320 07:31:48.706475 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"1c96afef-fa85-45f2-89cd-2fb2db26b9f8","Type":"ContainerStarted","Data":"39a7175ea606af7c07c2df47f03c8ec663b30621b93b90039594c9cd12b2413b"}
Mar 20 07:31:48 crc kubenswrapper[4749]: I0320 07:31:48.708550 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b364d204-dcba-4b43-98e1-f1e22bd89b2c","Type":"ContainerStarted","Data":"4dcf8275f054b0b2d0226379ce0df32f2d4719d61acd8fcb8d49ad2037493e7c"}
Mar 20 07:31:48 crc kubenswrapper[4749]: I0320 07:31:48.708914 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Mar 20 07:31:48 crc kubenswrapper[4749]: I0320 07:31:48.711033 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"a3723f09-8ae3-44e2-b5c7-7824e62755f7","Type":"ContainerStarted","Data":"2d29bb8513293705271d3e76ab564014f13c18956242bfdcd3fede3ca5027e7b"}
Mar 20 07:31:48 crc kubenswrapper[4749]: I0320 07:31:48.716517 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-tx9bw" podStartSLOduration=9.745348451 podStartE2EDuration="19.716498827s" podCreationTimestamp="2026-03-20 07:31:29 +0000 UTC" firstStartedPulling="2026-03-20 07:31:37.676478667 +0000 UTC m=+1134.226136314" lastFinishedPulling="2026-03-20 07:31:47.647629043 +0000 UTC m=+1144.197286690" observedRunningTime="2026-03-20 07:31:48.711782513 +0000 UTC m=+1145.261440160" watchObservedRunningTime="2026-03-20 07:31:48.716498827 +0000 UTC m=+1145.266156474"
Mar 20 07:31:48 crc kubenswrapper[4749]: I0320 07:31:48.717566 4749 generic.go:334] "Generic (PLEG): container finished" podID="d72e69d0-23f3-4d14-ab35-74ea19e79b69" containerID="156684dc537f45a6c2243dee7cac53811a46908dc2142971adf5ba1b3af4d068" exitCode=0
Mar 20 07:31:48 crc kubenswrapper[4749]: I0320 07:31:48.717703 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-kvqdd" event={"ID":"d72e69d0-23f3-4d14-ab35-74ea19e79b69","Type":"ContainerDied","Data":"156684dc537f45a6c2243dee7cac53811a46908dc2142971adf5ba1b3af4d068"}
Mar 20 07:31:48 crc kubenswrapper[4749]: I0320 07:31:48.721176 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"e8fd1b2c-620b-44a6-b4f0-1c4d2cbda056","Type":"ContainerStarted","Data":"e26eb7a6ec1e701e8186481b6e7ebd75635d285746d6529a220f1e7c8dff91e2"}
Mar 20 07:31:48 crc kubenswrapper[4749]: I0320 07:31:48.721364 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Mar 20 07:31:48 crc kubenswrapper[4749]: I0320 07:31:48.800059 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=12.781379078 podStartE2EDuration="22.800036309s" podCreationTimestamp="2026-03-20 07:31:26 +0000 UTC" firstStartedPulling="2026-03-20 07:31:37.667184031 +0000 UTC m=+1134.216841678" lastFinishedPulling="2026-03-20 07:31:47.685841262 +0000 UTC m=+1144.235498909" observedRunningTime="2026-03-20 07:31:48.776033995 +0000 UTC m=+1145.325691642" watchObservedRunningTime="2026-03-20 07:31:48.800036309 +0000 UTC m=+1145.349694036"
Mar 20 07:31:48 crc kubenswrapper[4749]: I0320 07:31:48.826237 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=16.183933868 podStartE2EDuration="25.826213636s" podCreationTimestamp="2026-03-20 07:31:23 +0000 UTC" firstStartedPulling="2026-03-20 07:31:37.509838034 +0000 UTC m=+1134.059495681" lastFinishedPulling="2026-03-20 07:31:47.152117802 +0000 UTC m=+1143.701775449" observedRunningTime="2026-03-20 07:31:48.79387741 +0000 UTC m=+1145.343535057" watchObservedRunningTime="2026-03-20 07:31:48.826213636 +0000 UTC m=+1145.375871283"
Mar 20 07:31:49 crc kubenswrapper[4749]: I0320 07:31:49.730713 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-kvqdd" event={"ID":"d72e69d0-23f3-4d14-ab35-74ea19e79b69","Type":"ContainerStarted","Data":"1994a4cf513b16cf45ab71ed2796aa7c1fbe48ed6467127549b163bafc340025"}
Mar 20 07:31:49 crc kubenswrapper[4749]: I0320 07:31:49.731047 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-kvqdd"
Mar 20 07:31:49 crc kubenswrapper[4749]: I0320 07:31:49.731065 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-kvqdd"
Mar 20 07:31:49 crc kubenswrapper[4749]: I0320 07:31:49.731077 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-kvqdd" event={"ID":"d72e69d0-23f3-4d14-ab35-74ea19e79b69","Type":"ContainerStarted","Data":"f8410ab5cfe3027210f96be703b0a26e6e1376d125bae2e7fbdcc193cdfb91e7"}
Mar 20 07:31:49 crc kubenswrapper[4749]: I0320 07:31:49.732602 4749 generic.go:334] "Generic (PLEG): container finished" podID="dc41dd13-79db-4f94-a11b-bc0cf369bb76" containerID="2644c3607f0576b1dc2cf9c609f794a4de541fac2b85b056baab58e0a9bad0f6" exitCode=0
Mar 20 07:31:49 crc kubenswrapper[4749]: I0320 07:31:49.732668 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-b628q" event={"ID":"dc41dd13-79db-4f94-a11b-bc0cf369bb76","Type":"ContainerDied","Data":"2644c3607f0576b1dc2cf9c609f794a4de541fac2b85b056baab58e0a9bad0f6"}
Mar 20 07:31:49 crc kubenswrapper[4749]: I0320 07:31:49.734701 4749 generic.go:334] "Generic (PLEG): container finished" podID="3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4" containerID="9e81ae96719597a58ce4ec18589543b8aa2374144c7e245c3e14e1b5c7adc1ea" exitCode=0
Mar 20 07:31:49 crc kubenswrapper[4749]: I0320 07:31:49.734788 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-bs7bl" event={"ID":"3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4","Type":"ContainerDied","Data":"9e81ae96719597a58ce4ec18589543b8aa2374144c7e245c3e14e1b5c7adc1ea"}
Mar 20 07:31:49 crc kubenswrapper[4749]: I0320 07:31:49.758531 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-kvqdd" podStartSLOduration=11.068864188 podStartE2EDuration="20.758511979s" podCreationTimestamp="2026-03-20 07:31:29 +0000 UTC" firstStartedPulling="2026-03-20 07:31:37.914054954 +0000 UTC m=+1134.463712601" lastFinishedPulling="2026-03-20 07:31:47.603702735 +0000 UTC m=+1144.153360392" observedRunningTime="2026-03-20 07:31:49.752960164 +0000 UTC m=+1146.302617811" watchObservedRunningTime="2026-03-20 07:31:49.758511979 +0000 UTC m=+1146.308169626"
Mar 20 07:31:51 crc kubenswrapper[4749]: I0320 07:31:51.751180 4749 generic.go:334] "Generic (PLEG): container finished" podID="8052bc33-6f6a-437e-9df5-508256f7e32f" containerID="11d2ce764eb7b4c4c511f3a78c63b6db96646d56a0169fa379c8ef66f658c7cb" exitCode=0
Mar 20 07:31:51 crc kubenswrapper[4749]: I0320 07:31:51.751724 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8052bc33-6f6a-437e-9df5-508256f7e32f","Type":"ContainerDied","Data":"11d2ce764eb7b4c4c511f3a78c63b6db96646d56a0169fa379c8ef66f658c7cb"}
Mar 20 07:31:51 crc kubenswrapper[4749]: I0320 07:31:51.755162 4749 generic.go:334] "Generic (PLEG): container finished" podID="1c96afef-fa85-45f2-89cd-2fb2db26b9f8" containerID="39a7175ea606af7c07c2df47f03c8ec663b30621b93b90039594c9cd12b2413b" exitCode=0
Mar 20 07:31:51 crc kubenswrapper[4749]: I0320 07:31:51.755720 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"1c96afef-fa85-45f2-89cd-2fb2db26b9f8","Type":"ContainerDied","Data":"39a7175ea606af7c07c2df47f03c8ec663b30621b93b90039594c9cd12b2413b"}
Mar 20 07:31:52 crc kubenswrapper[4749]: I0320 07:31:52.768096 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"a3723f09-8ae3-44e2-b5c7-7824e62755f7","Type":"ContainerStarted","Data":"335fc9b4cf005b8be35985f741fb47d9b71a811ca307f14d9b774bd7202f421f"}
Mar 20 07:31:52 crc kubenswrapper[4749]: I0320 07:31:52.771547 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-b628q" event={"ID":"dc41dd13-79db-4f94-a11b-bc0cf369bb76","Type":"ContainerStarted","Data":"ca58e44aa9e9389828aedd3563fe417bbd6fd7d1c1ff6a1a879bebe6d3529496"}
Mar 20 07:31:52 crc kubenswrapper[4749]: I0320 07:31:52.771730 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-b628q"
Mar 20 07:31:52 crc kubenswrapper[4749]: I0320 07:31:52.776580 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8052bc33-6f6a-437e-9df5-508256f7e32f","Type":"ContainerStarted","Data":"43e5ab5fd4a92b3b3c671ea2c1ba5b0f25ef2a104dfbf5440b7a0c672a36506f"}
Mar 20 07:31:52 crc kubenswrapper[4749]: I0320 07:31:52.782530 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-bs7bl" event={"ID":"3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4","Type":"ContainerStarted","Data":"8b852fdd91afb15c217e32231e5c131a07af4e87476f57409fb1673527101b38"}
Mar 20 07:31:52 crc kubenswrapper[4749]: I0320 07:31:52.782686 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fd796d7df-bs7bl"
Mar 20 07:31:52 crc kubenswrapper[4749]: I0320 07:31:52.786010 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"541ae4e0-d6e7-4ad9-8451-8f5b840050de","Type":"ContainerStarted","Data":"8ed8b7b392bdbc11e11cf7b2acfedd90d434022e5455fd263d1e0be24d0fa0a2"}
Mar 20 07:31:52 crc kubenswrapper[4749]: I0320 07:31:52.788470 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-7xcsl" event={"ID":"fe444947-6938-4000-8de5-462c8d0a42aa","Type":"ContainerStarted","Data":"a79eaaf57b44ba8558d827663b46aead6ee6d28c6eece07267d6004f13c72cc3"}
Mar 20 07:31:52 crc kubenswrapper[4749]: I0320 07:31:52.792075 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"1c96afef-fa85-45f2-89cd-2fb2db26b9f8","Type":"ContainerStarted","Data":"c49276307f914c15ec95ee85e65fff436732ffdacbee598ebe9ddb16a87cddf6"}
Mar 20 07:31:52 crc kubenswrapper[4749]: I0320 07:31:52.826027 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=9.832725327 podStartE2EDuration="23.82599674s" podCreationTimestamp="2026-03-20 07:31:29 +0000 UTC" firstStartedPulling="2026-03-20 07:31:37.732092629 +0000 UTC m=+1134.281750276" lastFinishedPulling="2026-03-20 07:31:51.725364032 +0000 UTC m=+1148.275021689" observedRunningTime="2026-03-20 07:31:52.807958621 +0000 UTC m=+1149.357616308" watchObservedRunningTime="2026-03-20 07:31:52.82599674 +0000 UTC m=+1149.375654427"
Mar 20 07:31:52 crc kubenswrapper[4749]: I0320 07:31:52.845762 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=8.865977344000001 podStartE2EDuration="21.84573808s" podCreationTimestamp="2026-03-20 07:31:31 +0000 UTC" firstStartedPulling="2026-03-20 07:31:38.720597259 +0000 UTC m=+1135.270254906" lastFinishedPulling="2026-03-20 07:31:51.700357995 +0000 UTC m=+1148.250015642" observedRunningTime="2026-03-20 07:31:52.83298295 +0000 UTC m=+1149.382640637" watchObservedRunningTime="2026-03-20 07:31:52.84573808 +0000 UTC m=+1149.395395767"
Mar 20 07:31:52 crc kubenswrapper[4749]: I0320 07:31:52.858909 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fd796d7df-bs7bl" podStartSLOduration=10.383205351 podStartE2EDuration="10.858882849s" podCreationTimestamp="2026-03-20 07:31:42 +0000 UTC" firstStartedPulling="2026-03-20 07:31:48.164703948 +0000 UTC m=+1144.714361595" lastFinishedPulling="2026-03-20 07:31:48.640381446 +0000 UTC m=+1145.190039093" observedRunningTime="2026-03-20 07:31:52.852624408 +0000 UTC m=+1149.402282095" watchObservedRunningTime="2026-03-20 07:31:52.858882849 +0000 UTC m=+1149.408540526"
Mar 20 07:31:52 crc kubenswrapper[4749]: I0320 07:31:52.887697 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-7xcsl" podStartSLOduration=7.370865132 podStartE2EDuration="10.88767625s" podCreationTimestamp="2026-03-20 07:31:42 +0000 UTC" firstStartedPulling="2026-03-20 07:31:48.149426106 +0000 UTC m=+1144.699083763" lastFinishedPulling="2026-03-20 07:31:51.666237224 +0000 UTC m=+1148.215894881" observedRunningTime="2026-03-20 07:31:52.871075736 +0000 UTC m=+1149.420733423" watchObservedRunningTime="2026-03-20 07:31:52.88767625 +0000 UTC m=+1149.437333907"
Mar 20 07:31:52 crc kubenswrapper[4749]: I0320 07:31:52.903726 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=21.719144762 podStartE2EDuration="31.903706279s" podCreationTimestamp="2026-03-20 07:31:21 +0000 UTC" firstStartedPulling="2026-03-20 07:31:37.515168243 +0000 UTC m=+1134.064825910" lastFinishedPulling="2026-03-20 07:31:47.69972978 +0000 UTC m=+1144.249387427" observedRunningTime="2026-03-20 07:31:52.900610744 +0000 UTC m=+1149.450268411" watchObservedRunningTime="2026-03-20 07:31:52.903706279 +0000 UTC m=+1149.453363936"
Mar 20 07:31:52 crc kubenswrapper[4749]: I0320 07:31:52.931362 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=20.664031733 podStartE2EDuration="30.931339962s" podCreationTimestamp="2026-03-20 07:31:22 +0000 UTC" firstStartedPulling="2026-03-20 07:31:37.418581134 +0000 UTC m=+1133.968238791" lastFinishedPulling="2026-03-20 07:31:47.685889373 +0000 UTC m=+1144.235547020" observedRunningTime="2026-03-20 07:31:52.923808398 +0000 UTC m=+1149.473466105" watchObservedRunningTime="2026-03-20 07:31:52.931339962 +0000 UTC m=+1149.480997609"
Mar 20 07:31:52 crc kubenswrapper[4749]: I0320 07:31:52.952942 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-b628q" podStartSLOduration=10.517952349 podStartE2EDuration="10.952908796s" podCreationTimestamp="2026-03-20 07:31:42 +0000 UTC" firstStartedPulling="2026-03-20 07:31:48.099417441 +0000 UTC m=+1144.649075088" lastFinishedPulling="2026-03-20 07:31:48.534373898 +0000 UTC m=+1145.084031535" observedRunningTime="2026-03-20 07:31:52.940443063 +0000 UTC m=+1149.490100720" watchObservedRunningTime="2026-03-20 07:31:52.952908796 +0000 UTC m=+1149.502566483"
Mar 20 07:31:53 crc kubenswrapper[4749]: I0320 07:31:53.065087 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Mar 20 07:31:53 crc kubenswrapper[4749]: I0320 07:31:53.807011 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8db06e36-0b00-4157-9345-69449da3e85f","Type":"ContainerStarted","Data":"30cf25cee069fd79718872b52ff67190111f7e963a3d6bd02d0024f6aff141bb"}
Mar 20 07:31:53 crc kubenswrapper[4749]: I0320 07:31:53.826071 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Mar 20 07:31:53 crc kubenswrapper[4749]: I0320 07:31:53.826153 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Mar 20 07:31:54 crc kubenswrapper[4749]: I0320 07:31:54.065595 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Mar 20 07:31:54 crc kubenswrapper[4749]: I0320 07:31:54.125112 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Mar 20 07:31:54 crc kubenswrapper[4749]: I0320 07:31:54.130735 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Mar 20 07:31:54 crc kubenswrapper[4749]: I0320 07:31:54.539479 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Mar 20 07:31:54 crc kubenswrapper[4749]: I0320 07:31:54.593588 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Mar 20 07:31:54 crc kubenswrapper[4749]: I0320 07:31:54.813266 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Mar 20 07:31:54 crc kubenswrapper[4749]: I0320 07:31:54.863210 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Mar 20 07:31:54 crc kubenswrapper[4749]: I0320 07:31:54.864155 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Mar 20 07:31:55 crc kubenswrapper[4749]: I0320 07:31:55.377481 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Mar 20 07:31:55 crc kubenswrapper[4749]: I0320 07:31:55.379631 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Mar 20 07:31:55 crc kubenswrapper[4749]: I0320 07:31:55.382827 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Mar 20 07:31:55 crc kubenswrapper[4749]: I0320 07:31:55.383417 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Mar 20 07:31:55 crc kubenswrapper[4749]: I0320 07:31:55.383666 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-9bnq2"
Mar 20 07:31:55 crc kubenswrapper[4749]: I0320 07:31:55.383904 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs"
Mar 20 07:31:55 crc kubenswrapper[4749]: I0320 07:31:55.388744 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Mar 20 07:31:55 crc kubenswrapper[4749]: I0320 07:31:55.542093 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b89f88dc-7614-4f05-ad26-dc1d46d10b85-scripts\") pod \"ovn-northd-0\" (UID: \"b89f88dc-7614-4f05-ad26-dc1d46d10b85\") " pod="openstack/ovn-northd-0"
Mar 20 07:31:55 crc kubenswrapper[4749]: I0320 07:31:55.542412 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b89f88dc-7614-4f05-ad26-dc1d46d10b85-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b89f88dc-7614-4f05-ad26-dc1d46d10b85\") " pod="openstack/ovn-northd-0"
Mar 20 07:31:55 crc kubenswrapper[4749]: I0320 07:31:55.542491 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b89f88dc-7614-4f05-ad26-dc1d46d10b85-config\") pod \"ovn-northd-0\" (UID: \"b89f88dc-7614-4f05-ad26-dc1d46d10b85\") " pod="openstack/ovn-northd-0"
Mar 20 07:31:55 crc kubenswrapper[4749]: I0320 07:31:55.542509 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz94m\" (UniqueName: \"kubernetes.io/projected/b89f88dc-7614-4f05-ad26-dc1d46d10b85-kube-api-access-vz94m\") pod \"ovn-northd-0\" (UID: \"b89f88dc-7614-4f05-ad26-dc1d46d10b85\") " pod="openstack/ovn-northd-0"
Mar 20 07:31:55 crc kubenswrapper[4749]: I0320 07:31:55.542547 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b89f88dc-7614-4f05-ad26-dc1d46d10b85-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b89f88dc-7614-4f05-ad26-dc1d46d10b85\") " pod="openstack/ovn-northd-0"
Mar 20 07:31:55 crc kubenswrapper[4749]: I0320 07:31:55.542594 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b89f88dc-7614-4f05-ad26-dc1d46d10b85-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b89f88dc-7614-4f05-ad26-dc1d46d10b85\") " pod="openstack/ovn-northd-0"
Mar 20 07:31:55 crc kubenswrapper[4749]: I0320 07:31:55.542826 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b89f88dc-7614-4f05-ad26-dc1d46d10b85-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"b89f88dc-7614-4f05-ad26-dc1d46d10b85\") " pod="openstack/ovn-northd-0"
Mar 20 07:31:55 crc kubenswrapper[4749]: I0320 07:31:55.644333 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b89f88dc-7614-4f05-ad26-dc1d46d10b85-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"b89f88dc-7614-4f05-ad26-dc1d46d10b85\") " pod="openstack/ovn-northd-0"
Mar 20 07:31:55 crc kubenswrapper[4749]: I0320 07:31:55.644389 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b89f88dc-7614-4f05-ad26-dc1d46d10b85-scripts\") pod \"ovn-northd-0\" (UID: \"b89f88dc-7614-4f05-ad26-dc1d46d10b85\") " pod="openstack/ovn-northd-0"
Mar 20 07:31:55 crc kubenswrapper[4749]: I0320 07:31:55.644411 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b89f88dc-7614-4f05-ad26-dc1d46d10b85-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b89f88dc-7614-4f05-ad26-dc1d46d10b85\") " pod="openstack/ovn-northd-0"
Mar 20 07:31:55 crc kubenswrapper[4749]: I0320 07:31:55.644461 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b89f88dc-7614-4f05-ad26-dc1d46d10b85-config\") pod \"ovn-northd-0\" (UID: \"b89f88dc-7614-4f05-ad26-dc1d46d10b85\") " pod="openstack/ovn-northd-0"
Mar 20 07:31:55 crc kubenswrapper[4749]: I0320 07:31:55.644478 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vz94m\" (UniqueName: \"kubernetes.io/projected/b89f88dc-7614-4f05-ad26-dc1d46d10b85-kube-api-access-vz94m\") pod \"ovn-northd-0\" (UID: \"b89f88dc-7614-4f05-ad26-dc1d46d10b85\") " pod="openstack/ovn-northd-0"
Mar 20 07:31:55 crc kubenswrapper[4749]: I0320 07:31:55.644497 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b89f88dc-7614-4f05-ad26-dc1d46d10b85-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b89f88dc-7614-4f05-ad26-dc1d46d10b85\") " pod="openstack/ovn-northd-0"
Mar 20 07:31:55 crc kubenswrapper[4749]: I0320 07:31:55.644526 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b89f88dc-7614-4f05-ad26-dc1d46d10b85-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b89f88dc-7614-4f05-ad26-dc1d46d10b85\") " pod="openstack/ovn-northd-0"
Mar 20 07:31:55 crc kubenswrapper[4749]: I0320 07:31:55.645267 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b89f88dc-7614-4f05-ad26-dc1d46d10b85-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b89f88dc-7614-4f05-ad26-dc1d46d10b85\") " pod="openstack/ovn-northd-0"
Mar 20 07:31:55 crc kubenswrapper[4749]: I0320 07:31:55.645613 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b89f88dc-7614-4f05-ad26-dc1d46d10b85-scripts\") pod \"ovn-northd-0\" (UID: \"b89f88dc-7614-4f05-ad26-dc1d46d10b85\") " pod="openstack/ovn-northd-0"
Mar 20 07:31:55 crc kubenswrapper[4749]: I0320 07:31:55.645648 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b89f88dc-7614-4f05-ad26-dc1d46d10b85-config\") pod \"ovn-northd-0\" (UID: \"b89f88dc-7614-4f05-ad26-dc1d46d10b85\") " pod="openstack/ovn-northd-0"
Mar 20 07:31:55 crc kubenswrapper[4749]: I0320 07:31:55.649849 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b89f88dc-7614-4f05-ad26-dc1d46d10b85-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"b89f88dc-7614-4f05-ad26-dc1d46d10b85\") " pod="openstack/ovn-northd-0"
Mar 20 07:31:55 crc kubenswrapper[4749]: I0320 07:31:55.650225 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b89f88dc-7614-4f05-ad26-dc1d46d10b85-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b89f88dc-7614-4f05-ad26-dc1d46d10b85\") " pod="openstack/ovn-northd-0"
Mar 20 07:31:55 crc kubenswrapper[4749]: I0320 07:31:55.656068 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b89f88dc-7614-4f05-ad26-dc1d46d10b85-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b89f88dc-7614-4f05-ad26-dc1d46d10b85\") " pod="openstack/ovn-northd-0"
Mar 20 07:31:55 crc kubenswrapper[4749]: I0320 07:31:55.667337 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz94m\" (UniqueName: \"kubernetes.io/projected/b89f88dc-7614-4f05-ad26-dc1d46d10b85-kube-api-access-vz94m\") pod \"ovn-northd-0\" (UID: \"b89f88dc-7614-4f05-ad26-dc1d46d10b85\") " pod="openstack/ovn-northd-0"
Mar 20 07:31:55 crc kubenswrapper[4749]: I0320 07:31:55.704449 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Mar 20 07:31:56 crc kubenswrapper[4749]: I0320 07:31:56.139083 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Mar 20 07:31:56 crc kubenswrapper[4749]: W0320 07:31:56.142608 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb89f88dc_7614_4f05_ad26_dc1d46d10b85.slice/crio-59ac2abfeece8e6d4a4b5636b1fa496d036cd723a09804c7b410749766865125 WatchSource:0}: Error finding container 59ac2abfeece8e6d4a4b5636b1fa496d036cd723a09804c7b410749766865125: Status 404 returned error can't find the container with id 59ac2abfeece8e6d4a4b5636b1fa496d036cd723a09804c7b410749766865125
Mar 20 07:31:56 crc kubenswrapper[4749]: I0320 07:31:56.436985 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-bs7bl"]
Mar 20 07:31:56 crc kubenswrapper[4749]: I0320 07:31:56.437248 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fd796d7df-bs7bl" podUID="3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4" containerName="dnsmasq-dns" containerID="cri-o://8b852fdd91afb15c217e32231e5c131a07af4e87476f57409fb1673527101b38" gracePeriod=10
Mar 20 07:31:56 crc kubenswrapper[4749]: I0320 07:31:56.438456 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7fd796d7df-bs7bl"
Mar 20 07:31:56 crc kubenswrapper[4749]: I0320 07:31:56.467601 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Mar 20 07:31:56 crc kubenswrapper[4749]: I0320 07:31:56.482679 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-f5tdv"]
Mar 20 07:31:56 crc kubenswrapper[4749]: I0320 07:31:56.484349 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-f5tdv"
Mar 20 07:31:56 crc kubenswrapper[4749]: I0320 07:31:56.497741 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-f5tdv"]
Mar 20 07:31:56 crc kubenswrapper[4749]: I0320 07:31:56.665440 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b30e9b27-acb7-4df2-b745-482fe080f360-config\") pod \"dnsmasq-dns-698758b865-f5tdv\" (UID: \"b30e9b27-acb7-4df2-b745-482fe080f360\") " pod="openstack/dnsmasq-dns-698758b865-f5tdv"
Mar 20 07:31:56 crc kubenswrapper[4749]: I0320 07:31:56.665935 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9btfr\" (UniqueName: \"kubernetes.io/projected/b30e9b27-acb7-4df2-b745-482fe080f360-kube-api-access-9btfr\") pod \"dnsmasq-dns-698758b865-f5tdv\" (UID: \"b30e9b27-acb7-4df2-b745-482fe080f360\") " pod="openstack/dnsmasq-dns-698758b865-f5tdv"
Mar 20 07:31:56 crc kubenswrapper[4749]: I0320 07:31:56.665966 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b30e9b27-acb7-4df2-b745-482fe080f360-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-f5tdv\" (UID: \"b30e9b27-acb7-4df2-b745-482fe080f360\") " pod="openstack/dnsmasq-dns-698758b865-f5tdv"
Mar 20 07:31:56 crc kubenswrapper[4749]: I0320 07:31:56.666002 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b30e9b27-acb7-4df2-b745-482fe080f360-dns-svc\") pod \"dnsmasq-dns-698758b865-f5tdv\" (UID: \"b30e9b27-acb7-4df2-b745-482fe080f360\") " pod="openstack/dnsmasq-dns-698758b865-f5tdv"
Mar 20 07:31:56 crc kubenswrapper[4749]: I0320 07:31:56.666043 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b30e9b27-acb7-4df2-b745-482fe080f360-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-f5tdv\" (UID: \"b30e9b27-acb7-4df2-b745-482fe080f360\") " pod="openstack/dnsmasq-dns-698758b865-f5tdv"
Mar 20 07:31:56 crc kubenswrapper[4749]: I0320 07:31:56.769367 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b30e9b27-acb7-4df2-b745-482fe080f360-dns-svc\") pod \"dnsmasq-dns-698758b865-f5tdv\" (UID: \"b30e9b27-acb7-4df2-b745-482fe080f360\") " pod="openstack/dnsmasq-dns-698758b865-f5tdv"
Mar 20 07:31:56 crc kubenswrapper[4749]: I0320 07:31:56.769434 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b30e9b27-acb7-4df2-b745-482fe080f360-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-f5tdv\" (UID: \"b30e9b27-acb7-4df2-b745-482fe080f360\") " pod="openstack/dnsmasq-dns-698758b865-f5tdv"
Mar 20 07:31:56 crc kubenswrapper[4749]: I0320 07:31:56.769501 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b30e9b27-acb7-4df2-b745-482fe080f360-config\") pod \"dnsmasq-dns-698758b865-f5tdv\" (UID: \"b30e9b27-acb7-4df2-b745-482fe080f360\") " pod="openstack/dnsmasq-dns-698758b865-f5tdv"
Mar 20 07:31:56 crc kubenswrapper[4749]: I0320 07:31:56.769603 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9btfr\" (UniqueName: \"kubernetes.io/projected/b30e9b27-acb7-4df2-b745-482fe080f360-kube-api-access-9btfr\") pod \"dnsmasq-dns-698758b865-f5tdv\" (UID: \"b30e9b27-acb7-4df2-b745-482fe080f360\") " pod="openstack/dnsmasq-dns-698758b865-f5tdv"
Mar 20 07:31:56 crc kubenswrapper[4749]: I0320 07:31:56.769628 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b30e9b27-acb7-4df2-b745-482fe080f360-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-f5tdv\" (UID: \"b30e9b27-acb7-4df2-b745-482fe080f360\") " pod="openstack/dnsmasq-dns-698758b865-f5tdv"
Mar 20 07:31:56 crc kubenswrapper[4749]: I0320 07:31:56.770422 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b30e9b27-acb7-4df2-b745-482fe080f360-dns-svc\") pod \"dnsmasq-dns-698758b865-f5tdv\" (UID: \"b30e9b27-acb7-4df2-b745-482fe080f360\") " pod="openstack/dnsmasq-dns-698758b865-f5tdv"
Mar 20 07:31:56 crc kubenswrapper[4749]: I0320 07:31:56.771021 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b30e9b27-acb7-4df2-b745-482fe080f360-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-f5tdv\" (UID: \"b30e9b27-acb7-4df2-b745-482fe080f360\") " pod="openstack/dnsmasq-dns-698758b865-f5tdv"
Mar 20 07:31:56 crc kubenswrapper[4749]: I0320 07:31:56.771383 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b30e9b27-acb7-4df2-b745-482fe080f360-config\") pod \"dnsmasq-dns-698758b865-f5tdv\" (UID: \"b30e9b27-acb7-4df2-b745-482fe080f360\") " pod="openstack/dnsmasq-dns-698758b865-f5tdv"
Mar 20 07:31:56 crc kubenswrapper[4749]: I0320 07:31:56.772382 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b30e9b27-acb7-4df2-b745-482fe080f360-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-f5tdv\" (UID: \"b30e9b27-acb7-4df2-b745-482fe080f360\") " pod="openstack/dnsmasq-dns-698758b865-f5tdv"
Mar 20 07:31:56 crc kubenswrapper[4749]: I0320 07:31:56.790110 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9btfr\" (UniqueName: \"kubernetes.io/projected/b30e9b27-acb7-4df2-b745-482fe080f360-kube-api-access-9btfr\") pod \"dnsmasq-dns-698758b865-f5tdv\" (UID: \"b30e9b27-acb7-4df2-b745-482fe080f360\") " pod="openstack/dnsmasq-dns-698758b865-f5tdv"
Mar 20 07:31:56 crc kubenswrapper[4749]: I0320 07:31:56.836242 4749 generic.go:334] "Generic (PLEG): container finished" podID="3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4" containerID="8b852fdd91afb15c217e32231e5c131a07af4e87476f57409fb1673527101b38" exitCode=0
Mar 20 07:31:56 crc kubenswrapper[4749]: I0320 07:31:56.836302 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-bs7bl" event={"ID":"3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4","Type":"ContainerDied","Data":"8b852fdd91afb15c217e32231e5c131a07af4e87476f57409fb1673527101b38"}
Mar 20 07:31:56 crc kubenswrapper[4749]: I0320 07:31:56.837578 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b89f88dc-7614-4f05-ad26-dc1d46d10b85","Type":"ContainerStarted","Data":"59ac2abfeece8e6d4a4b5636b1fa496d036cd723a09804c7b410749766865125"}
Mar 20 07:31:56 crc kubenswrapper[4749]: I0320 07:31:56.859942 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-f5tdv"
Mar 20 07:31:56 crc kubenswrapper[4749]: I0320 07:31:56.973908 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-bs7bl"
Mar 20 07:31:57 crc kubenswrapper[4749]: I0320 07:31:57.075243 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfqcl\" (UniqueName: \"kubernetes.io/projected/3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4-kube-api-access-zfqcl\") pod \"3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4\" (UID: \"3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4\") "
Mar 20 07:31:57 crc kubenswrapper[4749]: I0320 07:31:57.075542 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4-config\") pod \"3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4\" (UID: \"3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4\") "
Mar 20 07:31:57 crc kubenswrapper[4749]: I0320 07:31:57.075593 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4-dns-svc\") pod \"3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4\" (UID: \"3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4\") "
Mar 20 07:31:57 crc kubenswrapper[4749]: I0320 07:31:57.075676 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4-ovsdbserver-nb\") pod \"3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4\" (UID: \"3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4\") "
Mar 20 07:31:57 crc kubenswrapper[4749]: I0320 07:31:57.085928 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4-kube-api-access-zfqcl" (OuterVolumeSpecName: "kube-api-access-zfqcl") pod "3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4" (UID: "3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4"). InnerVolumeSpecName "kube-api-access-zfqcl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 07:31:57 crc kubenswrapper[4749]: I0320 07:31:57.115903 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4" (UID: "3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 07:31:57 crc kubenswrapper[4749]: I0320 07:31:57.121029 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4" (UID: "3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 07:31:57 crc kubenswrapper[4749]: I0320 07:31:57.132392 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4-config" (OuterVolumeSpecName: "config") pod "3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4" (UID: "3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 07:31:57 crc kubenswrapper[4749]: I0320 07:31:57.177825 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 20 07:31:57 crc kubenswrapper[4749]: I0320 07:31:57.177871 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfqcl\" (UniqueName: \"kubernetes.io/projected/3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4-kube-api-access-zfqcl\") on node \"crc\" DevicePath \"\""
Mar 20 07:31:57 crc kubenswrapper[4749]: I0320 07:31:57.177888 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4-config\") on node \"crc\" DevicePath \"\""
Mar 20 07:31:57 crc kubenswrapper[4749]: I0320 07:31:57.177900 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 07:31:57 crc kubenswrapper[4749]: I0320 07:31:57.303079 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-f5tdv"]
Mar 20 07:31:57 crc kubenswrapper[4749]: W0320 07:31:57.358797 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb30e9b27_acb7_4df2_b745_482fe080f360.slice/crio-1c7761b5b2d50de8a15920d4f1291212010e8a0c8bb15e87825baac84ae8e300 WatchSource:0}: Error finding container 1c7761b5b2d50de8a15920d4f1291212010e8a0c8bb15e87825baac84ae8e300: Status 404 returned error can't find the container with id 1c7761b5b2d50de8a15920d4f1291212010e8a0c8bb15e87825baac84ae8e300
Mar 20 07:31:57 crc kubenswrapper[4749]: I0320 07:31:57.653664 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"]
Mar 20 07:31:57 crc kubenswrapper[4749]: E0320 07:31:57.654022 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4" containerName="init"
Mar 20 07:31:57 crc kubenswrapper[4749]: I0320 07:31:57.654042 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4" containerName="init"
Mar 20 07:31:57 crc kubenswrapper[4749]: E0320 07:31:57.654096 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4" containerName="dnsmasq-dns"
Mar 20 07:31:57 crc kubenswrapper[4749]: I0320 07:31:57.654104 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4" containerName="dnsmasq-dns"
Mar 20 07:31:57 crc kubenswrapper[4749]: I0320 07:31:57.654305 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4" containerName="dnsmasq-dns"
Mar 20 07:31:57 crc kubenswrapper[4749]: I0320 07:31:57.663335 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Mar 20 07:31:57 crc kubenswrapper[4749]: I0320 07:31:57.667170 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-m2fbf"
Mar 20 07:31:57 crc kubenswrapper[4749]: I0320 07:31:57.667562 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files"
Mar 20 07:31:57 crc kubenswrapper[4749]: I0320 07:31:57.667936 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf"
Mar 20 07:31:57 crc kubenswrapper[4749]: I0320 07:31:57.669553 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data"
Mar 20 07:31:57 crc kubenswrapper[4749]: I0320 07:31:57.680400 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Mar 20 07:31:57 crc kubenswrapper[4749]: I0320 07:31:57.788570 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26pkj\" (UniqueName: \"kubernetes.io/projected/8272956e-b31a-4bd8-9118-3ca9721e6d75-kube-api-access-26pkj\") pod \"swift-storage-0\" (UID: \"8272956e-b31a-4bd8-9118-3ca9721e6d75\") " pod="openstack/swift-storage-0"
Mar 20 07:31:57 crc kubenswrapper[4749]: I0320 07:31:57.788648 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8272956e-b31a-4bd8-9118-3ca9721e6d75-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"8272956e-b31a-4bd8-9118-3ca9721e6d75\") " pod="openstack/swift-storage-0"
Mar 20 07:31:57 crc kubenswrapper[4749]: I0320 07:31:57.788691 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"8272956e-b31a-4bd8-9118-3ca9721e6d75\") " pod="openstack/swift-storage-0"
Mar 20 07:31:57 crc kubenswrapper[4749]: I0320 07:31:57.788719 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8272956e-b31a-4bd8-9118-3ca9721e6d75-etc-swift\") pod \"swift-storage-0\" (UID: \"8272956e-b31a-4bd8-9118-3ca9721e6d75\") " pod="openstack/swift-storage-0"
Mar 20 07:31:57 crc kubenswrapper[4749]: I0320 07:31:57.788743 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/8272956e-b31a-4bd8-9118-3ca9721e6d75-lock\") pod \"swift-storage-0\" (UID: \"8272956e-b31a-4bd8-9118-3ca9721e6d75\") " pod="openstack/swift-storage-0"
Mar 20 07:31:57 crc kubenswrapper[4749]: I0320 07:31:57.788819 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/8272956e-b31a-4bd8-9118-3ca9721e6d75-cache\") pod \"swift-storage-0\" (UID: \"8272956e-b31a-4bd8-9118-3ca9721e6d75\") " pod="openstack/swift-storage-0"
Mar 20 07:31:57 crc kubenswrapper[4749]: I0320 07:31:57.843524 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-b628q"
Mar 20 07:31:57 crc kubenswrapper[4749]: I0320 07:31:57.846683 4749 generic.go:334] "Generic (PLEG): container finished" podID="b30e9b27-acb7-4df2-b745-482fe080f360" containerID="d9ebc69cd32fea04c5ad0f326e861b49fe2bfdfd0c4f00aff870db5d07fe495b" exitCode=0
Mar 20 07:31:57 crc kubenswrapper[4749]: I0320 07:31:57.846789 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-f5tdv" event={"ID":"b30e9b27-acb7-4df2-b745-482fe080f360","Type":"ContainerDied","Data":"d9ebc69cd32fea04c5ad0f326e861b49fe2bfdfd0c4f00aff870db5d07fe495b"}
Mar 20 07:31:57 crc kubenswrapper[4749]: I0320 07:31:57.846834 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-f5tdv" event={"ID":"b30e9b27-acb7-4df2-b745-482fe080f360","Type":"ContainerStarted","Data":"1c7761b5b2d50de8a15920d4f1291212010e8a0c8bb15e87825baac84ae8e300"}
Mar 20 07:31:57 crc kubenswrapper[4749]: I0320 07:31:57.848667 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b89f88dc-7614-4f05-ad26-dc1d46d10b85","Type":"ContainerStarted","Data":"2ae9919b55ea6fd30b82d707cef60e449cf3f36d172a0a37b2ed54d048f11cd9"}
Mar 20 07:31:57 crc kubenswrapper[4749]: I0320 07:31:57.848703 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b89f88dc-7614-4f05-ad26-dc1d46d10b85","Type":"ContainerStarted","Data":"53446e5f768017fdd4887557ea55ec3b85fc86be69ff2ffc07053fbe183e219d"}
Mar 20 07:31:57 crc kubenswrapper[4749]: I0320 07:31:57.848811 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Mar 20 07:31:57 crc kubenswrapper[4749]: I0320 07:31:57.851435 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-bs7bl" event={"ID":"3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4","Type":"ContainerDied","Data":"b8842ddc36334176347958e60c83d16d5bd52f2d8d3a54e76a33bfbc8376230a"}
Mar 20 07:31:57 crc kubenswrapper[4749]: I0320 07:31:57.851511 4749 scope.go:117] "RemoveContainer" containerID="8b852fdd91afb15c217e32231e5c131a07af4e87476f57409fb1673527101b38"
Mar 20 07:31:57 crc kubenswrapper[4749]: I0320 07:31:57.851611 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-bs7bl"
Mar 20 07:31:57 crc kubenswrapper[4749]: I0320 07:31:57.898088 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26pkj\" (UniqueName: \"kubernetes.io/projected/8272956e-b31a-4bd8-9118-3ca9721e6d75-kube-api-access-26pkj\") pod \"swift-storage-0\" (UID: \"8272956e-b31a-4bd8-9118-3ca9721e6d75\") " pod="openstack/swift-storage-0"
Mar 20 07:31:57 crc kubenswrapper[4749]: I0320 07:31:57.898230 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8272956e-b31a-4bd8-9118-3ca9721e6d75-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"8272956e-b31a-4bd8-9118-3ca9721e6d75\") " pod="openstack/swift-storage-0"
Mar 20 07:31:57 crc kubenswrapper[4749]: I0320 07:31:57.898331 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"8272956e-b31a-4bd8-9118-3ca9721e6d75\") " pod="openstack/swift-storage-0"
Mar 20 07:31:57 crc kubenswrapper[4749]: I0320 07:31:57.898385 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8272956e-b31a-4bd8-9118-3ca9721e6d75-etc-swift\") pod \"swift-storage-0\" (UID: \"8272956e-b31a-4bd8-9118-3ca9721e6d75\") " pod="openstack/swift-storage-0"
Mar 20 07:31:57 crc kubenswrapper[4749]: I0320 07:31:57.898424 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/8272956e-b31a-4bd8-9118-3ca9721e6d75-lock\") pod \"swift-storage-0\" (UID: \"8272956e-b31a-4bd8-9118-3ca9721e6d75\") " pod="openstack/swift-storage-0"
Mar 20 07:31:57 crc kubenswrapper[4749]: I0320 07:31:57.898507 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/8272956e-b31a-4bd8-9118-3ca9721e6d75-cache\") pod \"swift-storage-0\" (UID: \"8272956e-b31a-4bd8-9118-3ca9721e6d75\") " pod="openstack/swift-storage-0"
Mar 20 07:31:57 crc kubenswrapper[4749]: I0320 07:31:57.899198 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/8272956e-b31a-4bd8-9118-3ca9721e6d75-cache\") pod \"swift-storage-0\" (UID: \"8272956e-b31a-4bd8-9118-3ca9721e6d75\") " pod="openstack/swift-storage-0"
Mar 20 07:31:57 crc kubenswrapper[4749]: I0320 07:31:57.899358 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"8272956e-b31a-4bd8-9118-3ca9721e6d75\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/swift-storage-0"
Mar 20 07:31:57 crc kubenswrapper[4749]: E0320 07:31:57.899395 4749 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 20 07:31:57 crc kubenswrapper[4749]: E0320 07:31:57.899421 4749 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 20 07:31:57 crc kubenswrapper[4749]: E0320 07:31:57.899477 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8272956e-b31a-4bd8-9118-3ca9721e6d75-etc-swift podName:8272956e-b31a-4bd8-9118-3ca9721e6d75 nodeName:}" failed. No retries permitted until 2026-03-20 07:31:58.399455055 +0000 UTC m=+1154.949112722 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8272956e-b31a-4bd8-9118-3ca9721e6d75-etc-swift") pod "swift-storage-0" (UID: "8272956e-b31a-4bd8-9118-3ca9721e6d75") : configmap "swift-ring-files" not found
Mar 20 07:31:57 crc kubenswrapper[4749]: I0320 07:31:57.900075 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/8272956e-b31a-4bd8-9118-3ca9721e6d75-lock\") pod \"swift-storage-0\" (UID: \"8272956e-b31a-4bd8-9118-3ca9721e6d75\") " pod="openstack/swift-storage-0"
Mar 20 07:31:57 crc kubenswrapper[4749]: I0320 07:31:57.920922 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.6586756390000001 podStartE2EDuration="2.920900536s" podCreationTimestamp="2026-03-20 07:31:55 +0000 UTC" firstStartedPulling="2026-03-20 07:31:56.144534346 +0000 UTC m=+1152.694192003" lastFinishedPulling="2026-03-20 07:31:57.406759253 +0000 UTC m=+1153.956416900" observedRunningTime="2026-03-20 07:31:57.890391885 +0000 UTC m=+1154.440049542" watchObservedRunningTime="2026-03-20 07:31:57.920900536 +0000 UTC m=+1154.470558183"
Mar 20 07:31:57 crc kubenswrapper[4749]: I0320 07:31:57.929461 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26pkj\" (UniqueName: \"kubernetes.io/projected/8272956e-b31a-4bd8-9118-3ca9721e6d75-kube-api-access-26pkj\") pod \"swift-storage-0\" (UID: \"8272956e-b31a-4bd8-9118-3ca9721e6d75\") " pod="openstack/swift-storage-0"
Mar 20 07:31:57 crc kubenswrapper[4749]: I0320 07:31:57.934053 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"8272956e-b31a-4bd8-9118-3ca9721e6d75\") " pod="openstack/swift-storage-0"
Mar 20 07:31:57 crc kubenswrapper[4749]: I0320 07:31:57.952841 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8272956e-b31a-4bd8-9118-3ca9721e6d75-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"8272956e-b31a-4bd8-9118-3ca9721e6d75\") " pod="openstack/swift-storage-0"
Mar 20 07:31:58 crc kubenswrapper[4749]: I0320 07:31:58.017772 4749 scope.go:117] "RemoveContainer" containerID="9e81ae96719597a58ce4ec18589543b8aa2374144c7e245c3e14e1b5c7adc1ea"
Mar 20 07:31:58 crc kubenswrapper[4749]: I0320 07:31:58.035772 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-bs7bl"]
Mar 20 07:31:58 crc kubenswrapper[4749]: I0320 07:31:58.041978 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-bs7bl"]
Mar 20 07:31:58 crc kubenswrapper[4749]: I0320 07:31:58.191507 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4" path="/var/lib/kubelet/pods/3d445ec2-8c9d-4312-9b9e-fcc997d5e0c4/volumes"
Mar 20 07:31:58 crc kubenswrapper[4749]: I0320 07:31:58.408798 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8272956e-b31a-4bd8-9118-3ca9721e6d75-etc-swift\") pod \"swift-storage-0\" (UID: \"8272956e-b31a-4bd8-9118-3ca9721e6d75\") " pod="openstack/swift-storage-0"
Mar 20 07:31:58 crc kubenswrapper[4749]: E0320 07:31:58.408939 4749 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 20 07:31:58 crc kubenswrapper[4749]: E0320 07:31:58.408960 4749 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 20 07:31:58 crc kubenswrapper[4749]: E0320 07:31:58.409018 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8272956e-b31a-4bd8-9118-3ca9721e6d75-etc-swift podName:8272956e-b31a-4bd8-9118-3ca9721e6d75 nodeName:}" failed. No retries permitted until 2026-03-20 07:31:59.409000287 +0000 UTC m=+1155.958657934 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8272956e-b31a-4bd8-9118-3ca9721e6d75-etc-swift") pod "swift-storage-0" (UID: "8272956e-b31a-4bd8-9118-3ca9721e6d75") : configmap "swift-ring-files" not found
Mar 20 07:31:58 crc kubenswrapper[4749]: I0320 07:31:58.874870 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-f5tdv" event={"ID":"b30e9b27-acb7-4df2-b745-482fe080f360","Type":"ContainerStarted","Data":"cfcbb53039c573bf388a044b8c8a51cccd81128168646e2bdf5cb23961134a3d"}
Mar 20 07:31:58 crc kubenswrapper[4749]: I0320 07:31:58.874959 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-f5tdv"
Mar 20 07:31:59 crc kubenswrapper[4749]: I0320 07:31:59.424188 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8272956e-b31a-4bd8-9118-3ca9721e6d75-etc-swift\") pod \"swift-storage-0\" (UID: \"8272956e-b31a-4bd8-9118-3ca9721e6d75\") " pod="openstack/swift-storage-0"
Mar 20 07:31:59 crc kubenswrapper[4749]: E0320 07:31:59.424489 4749 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 20 07:31:59 crc kubenswrapper[4749]: E0320 07:31:59.425575 4749 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 20 07:31:59 crc kubenswrapper[4749]: E0320 07:31:59.425714 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8272956e-b31a-4bd8-9118-3ca9721e6d75-etc-swift podName:8272956e-b31a-4bd8-9118-3ca9721e6d75 nodeName:}" failed. No retries permitted until 2026-03-20 07:32:01.425693573 +0000 UTC m=+1157.975351220 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8272956e-b31a-4bd8-9118-3ca9721e6d75-etc-swift") pod "swift-storage-0" (UID: "8272956e-b31a-4bd8-9118-3ca9721e6d75") : configmap "swift-ring-files" not found Mar 20 07:31:59 crc kubenswrapper[4749]: I0320 07:31:59.957314 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 20 07:31:59 crc kubenswrapper[4749]: I0320 07:31:59.989413 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-f5tdv" podStartSLOduration=3.989395112 podStartE2EDuration="3.989395112s" podCreationTimestamp="2026-03-20 07:31:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:31:58.901471084 +0000 UTC m=+1155.451128731" watchObservedRunningTime="2026-03-20 07:31:59.989395112 +0000 UTC m=+1156.539052759" Mar 20 07:32:00 crc kubenswrapper[4749]: I0320 07:32:00.059572 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 20 07:32:00 crc kubenswrapper[4749]: I0320 07:32:00.154996 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566532-vqmck"] Mar 20 07:32:00 crc kubenswrapper[4749]: I0320 07:32:00.156579 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566532-vqmck" Mar 20 07:32:00 crc kubenswrapper[4749]: I0320 07:32:00.160677 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6rhdf" Mar 20 07:32:00 crc kubenswrapper[4749]: I0320 07:32:00.163255 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 07:32:00 crc kubenswrapper[4749]: I0320 07:32:00.163631 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 07:32:00 crc kubenswrapper[4749]: I0320 07:32:00.192247 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566532-vqmck"] Mar 20 07:32:00 crc kubenswrapper[4749]: I0320 07:32:00.238851 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-749hn\" (UniqueName: \"kubernetes.io/projected/d08b93fb-b8b6-4ec4-a812-fa127e9519ae-kube-api-access-749hn\") pod \"auto-csr-approver-29566532-vqmck\" (UID: \"d08b93fb-b8b6-4ec4-a812-fa127e9519ae\") " pod="openshift-infra/auto-csr-approver-29566532-vqmck" Mar 20 07:32:00 crc kubenswrapper[4749]: I0320 07:32:00.340595 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-749hn\" (UniqueName: \"kubernetes.io/projected/d08b93fb-b8b6-4ec4-a812-fa127e9519ae-kube-api-access-749hn\") pod \"auto-csr-approver-29566532-vqmck\" (UID: \"d08b93fb-b8b6-4ec4-a812-fa127e9519ae\") " pod="openshift-infra/auto-csr-approver-29566532-vqmck" Mar 20 07:32:00 crc kubenswrapper[4749]: I0320 07:32:00.363203 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-749hn\" (UniqueName: \"kubernetes.io/projected/d08b93fb-b8b6-4ec4-a812-fa127e9519ae-kube-api-access-749hn\") pod \"auto-csr-approver-29566532-vqmck\" (UID: \"d08b93fb-b8b6-4ec4-a812-fa127e9519ae\") " pod="openshift-infra/auto-csr-approver-29566532-vqmck" Mar 20 07:32:00 crc kubenswrapper[4749]: I0320 
07:32:00.491809 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566532-vqmck" Mar 20 07:32:00 crc kubenswrapper[4749]: I0320 07:32:00.924620 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566532-vqmck"] Mar 20 07:32:00 crc kubenswrapper[4749]: W0320 07:32:00.925002 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd08b93fb_b8b6_4ec4_a812_fa127e9519ae.slice/crio-92a7598dabd62e1e0d796f88514115cfde9946d58e0333940a7ec9c50f10906f WatchSource:0}: Error finding container 92a7598dabd62e1e0d796f88514115cfde9946d58e0333940a7ec9c50f10906f: Status 404 returned error can't find the container with id 92a7598dabd62e1e0d796f88514115cfde9946d58e0333940a7ec9c50f10906f Mar 20 07:32:01 crc kubenswrapper[4749]: I0320 07:32:01.459819 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8272956e-b31a-4bd8-9118-3ca9721e6d75-etc-swift\") pod \"swift-storage-0\" (UID: \"8272956e-b31a-4bd8-9118-3ca9721e6d75\") " pod="openstack/swift-storage-0" Mar 20 07:32:01 crc kubenswrapper[4749]: E0320 07:32:01.460067 4749 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 07:32:01 crc kubenswrapper[4749]: E0320 07:32:01.460116 4749 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 07:32:01 crc kubenswrapper[4749]: E0320 07:32:01.460211 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8272956e-b31a-4bd8-9118-3ca9721e6d75-etc-swift podName:8272956e-b31a-4bd8-9118-3ca9721e6d75 nodeName:}" failed. No retries permitted until 2026-03-20 07:32:05.460184301 +0000 UTC m=+1162.009841978 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8272956e-b31a-4bd8-9118-3ca9721e6d75-etc-swift") pod "swift-storage-0" (UID: "8272956e-b31a-4bd8-9118-3ca9721e6d75") : configmap "swift-ring-files" not found Mar 20 07:32:01 crc kubenswrapper[4749]: I0320 07:32:01.573442 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-dgbxq"] Mar 20 07:32:01 crc kubenswrapper[4749]: I0320 07:32:01.574648 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-dgbxq" Mar 20 07:32:01 crc kubenswrapper[4749]: I0320 07:32:01.578482 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 20 07:32:01 crc kubenswrapper[4749]: I0320 07:32:01.579193 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 20 07:32:01 crc kubenswrapper[4749]: I0320 07:32:01.581418 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 20 07:32:01 crc kubenswrapper[4749]: I0320 07:32:01.604420 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-dgbxq"] Mar 20 07:32:01 crc kubenswrapper[4749]: I0320 07:32:01.663627 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3adcfcfa-0ea4-4c5e-9e57-957538c1469e-dispersionconf\") pod \"swift-ring-rebalance-dgbxq\" (UID: \"3adcfcfa-0ea4-4c5e-9e57-957538c1469e\") " pod="openstack/swift-ring-rebalance-dgbxq" Mar 20 07:32:01 crc kubenswrapper[4749]: I0320 07:32:01.663673 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3adcfcfa-0ea4-4c5e-9e57-957538c1469e-scripts\") pod \"swift-ring-rebalance-dgbxq\" (UID: \"3adcfcfa-0ea4-4c5e-9e57-957538c1469e\") " pod="openstack/swift-ring-rebalance-dgbxq" Mar 20 07:32:01 crc kubenswrapper[4749]: I0320 07:32:01.663727 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3adcfcfa-0ea4-4c5e-9e57-957538c1469e-combined-ca-bundle\") pod \"swift-ring-rebalance-dgbxq\" (UID: \"3adcfcfa-0ea4-4c5e-9e57-957538c1469e\") " pod="openstack/swift-ring-rebalance-dgbxq" Mar 20 07:32:01 crc kubenswrapper[4749]: I0320 07:32:01.663750 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3adcfcfa-0ea4-4c5e-9e57-957538c1469e-swiftconf\") pod \"swift-ring-rebalance-dgbxq\" (UID: \"3adcfcfa-0ea4-4c5e-9e57-957538c1469e\") " pod="openstack/swift-ring-rebalance-dgbxq" Mar 20 07:32:01 crc kubenswrapper[4749]: I0320 07:32:01.663786 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5g454\" (UniqueName: \"kubernetes.io/projected/3adcfcfa-0ea4-4c5e-9e57-957538c1469e-kube-api-access-5g454\") pod \"swift-ring-rebalance-dgbxq\" (UID: \"3adcfcfa-0ea4-4c5e-9e57-957538c1469e\") " pod="openstack/swift-ring-rebalance-dgbxq" Mar 20 07:32:01 crc kubenswrapper[4749]: I0320 07:32:01.663879 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3adcfcfa-0ea4-4c5e-9e57-957538c1469e-etc-swift\") pod \"swift-ring-rebalance-dgbxq\" (UID: \"3adcfcfa-0ea4-4c5e-9e57-957538c1469e\") " pod="openstack/swift-ring-rebalance-dgbxq" Mar 20 07:32:01 crc kubenswrapper[4749]: I0320 07:32:01.663928 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3adcfcfa-0ea4-4c5e-9e57-957538c1469e-ring-data-devices\") pod \"swift-ring-rebalance-dgbxq\" (UID: \"3adcfcfa-0ea4-4c5e-9e57-957538c1469e\") " pod="openstack/swift-ring-rebalance-dgbxq" Mar 20 
07:32:01 crc kubenswrapper[4749]: I0320 07:32:01.765690 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3adcfcfa-0ea4-4c5e-9e57-957538c1469e-etc-swift\") pod \"swift-ring-rebalance-dgbxq\" (UID: \"3adcfcfa-0ea4-4c5e-9e57-957538c1469e\") " pod="openstack/swift-ring-rebalance-dgbxq" Mar 20 07:32:01 crc kubenswrapper[4749]: I0320 07:32:01.765752 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3adcfcfa-0ea4-4c5e-9e57-957538c1469e-ring-data-devices\") pod \"swift-ring-rebalance-dgbxq\" (UID: \"3adcfcfa-0ea4-4c5e-9e57-957538c1469e\") " pod="openstack/swift-ring-rebalance-dgbxq" Mar 20 07:32:01 crc kubenswrapper[4749]: I0320 07:32:01.765824 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3adcfcfa-0ea4-4c5e-9e57-957538c1469e-dispersionconf\") pod \"swift-ring-rebalance-dgbxq\" (UID: \"3adcfcfa-0ea4-4c5e-9e57-957538c1469e\") " pod="openstack/swift-ring-rebalance-dgbxq" Mar 20 07:32:01 crc kubenswrapper[4749]: I0320 07:32:01.765855 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3adcfcfa-0ea4-4c5e-9e57-957538c1469e-scripts\") pod \"swift-ring-rebalance-dgbxq\" (UID: \"3adcfcfa-0ea4-4c5e-9e57-957538c1469e\") " pod="openstack/swift-ring-rebalance-dgbxq" Mar 20 07:32:01 crc kubenswrapper[4749]: I0320 07:32:01.765910 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3adcfcfa-0ea4-4c5e-9e57-957538c1469e-combined-ca-bundle\") pod \"swift-ring-rebalance-dgbxq\" (UID: \"3adcfcfa-0ea4-4c5e-9e57-957538c1469e\") " pod="openstack/swift-ring-rebalance-dgbxq" Mar 20 07:32:01 crc kubenswrapper[4749]: I0320 07:32:01.765937 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3adcfcfa-0ea4-4c5e-9e57-957538c1469e-swiftconf\") pod \"swift-ring-rebalance-dgbxq\" (UID: \"3adcfcfa-0ea4-4c5e-9e57-957538c1469e\") " pod="openstack/swift-ring-rebalance-dgbxq" Mar 20 07:32:01 crc kubenswrapper[4749]: I0320 07:32:01.765963 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5g454\" (UniqueName: \"kubernetes.io/projected/3adcfcfa-0ea4-4c5e-9e57-957538c1469e-kube-api-access-5g454\") pod \"swift-ring-rebalance-dgbxq\" (UID: \"3adcfcfa-0ea4-4c5e-9e57-957538c1469e\") " pod="openstack/swift-ring-rebalance-dgbxq" Mar 20 07:32:01 crc kubenswrapper[4749]: I0320 07:32:01.766725 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3adcfcfa-0ea4-4c5e-9e57-957538c1469e-etc-swift\") pod \"swift-ring-rebalance-dgbxq\" (UID: \"3adcfcfa-0ea4-4c5e-9e57-957538c1469e\") " pod="openstack/swift-ring-rebalance-dgbxq" Mar 20 07:32:01 crc kubenswrapper[4749]: I0320 07:32:01.767190 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3adcfcfa-0ea4-4c5e-9e57-957538c1469e-scripts\") pod \"swift-ring-rebalance-dgbxq\" (UID: \"3adcfcfa-0ea4-4c5e-9e57-957538c1469e\") " pod="openstack/swift-ring-rebalance-dgbxq" Mar 20 07:32:01 crc kubenswrapper[4749]: I0320 07:32:01.767853 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3adcfcfa-0ea4-4c5e-9e57-957538c1469e-ring-data-devices\") pod \"swift-ring-rebalance-dgbxq\" (UID: \"3adcfcfa-0ea4-4c5e-9e57-957538c1469e\") " pod="openstack/swift-ring-rebalance-dgbxq" Mar 20 07:32:01 crc kubenswrapper[4749]: I0320 07:32:01.771860 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3adcfcfa-0ea4-4c5e-9e57-957538c1469e-swiftconf\") pod \"swift-ring-rebalance-dgbxq\" (UID: \"3adcfcfa-0ea4-4c5e-9e57-957538c1469e\") " pod="openstack/swift-ring-rebalance-dgbxq" Mar 20 07:32:01 crc kubenswrapper[4749]: I0320 07:32:01.772563 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3adcfcfa-0ea4-4c5e-9e57-957538c1469e-combined-ca-bundle\") pod \"swift-ring-rebalance-dgbxq\" (UID: \"3adcfcfa-0ea4-4c5e-9e57-957538c1469e\") " pod="openstack/swift-ring-rebalance-dgbxq" Mar 20 07:32:01 crc kubenswrapper[4749]: I0320 07:32:01.773045 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3adcfcfa-0ea4-4c5e-9e57-957538c1469e-dispersionconf\") pod \"swift-ring-rebalance-dgbxq\" (UID: \"3adcfcfa-0ea4-4c5e-9e57-957538c1469e\") " pod="openstack/swift-ring-rebalance-dgbxq" Mar 20 07:32:01 crc kubenswrapper[4749]: I0320 07:32:01.781914 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5g454\" (UniqueName: \"kubernetes.io/projected/3adcfcfa-0ea4-4c5e-9e57-957538c1469e-kube-api-access-5g454\") pod \"swift-ring-rebalance-dgbxq\" (UID: \"3adcfcfa-0ea4-4c5e-9e57-957538c1469e\") " pod="openstack/swift-ring-rebalance-dgbxq" Mar 20 07:32:01 crc kubenswrapper[4749]: I0320 07:32:01.907372 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566532-vqmck" event={"ID":"d08b93fb-b8b6-4ec4-a812-fa127e9519ae","Type":"ContainerStarted","Data":"92a7598dabd62e1e0d796f88514115cfde9946d58e0333940a7ec9c50f10906f"} Mar 20 07:32:01 crc kubenswrapper[4749]: I0320 07:32:01.934432 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-dgbxq" Mar 20 07:32:02 crc kubenswrapper[4749]: I0320 07:32:02.406006 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-dgbxq"] Mar 20 07:32:02 crc kubenswrapper[4749]: W0320 07:32:02.412195 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3adcfcfa_0ea4_4c5e_9e57_957538c1469e.slice/crio-0e90ef62cbc5c8075e53d48cf7cb1c8b5d794d84f6257afbd684759e0231829c WatchSource:0}: Error finding container 0e90ef62cbc5c8075e53d48cf7cb1c8b5d794d84f6257afbd684759e0231829c: Status 404 returned error can't find the container with id 0e90ef62cbc5c8075e53d48cf7cb1c8b5d794d84f6257afbd684759e0231829c Mar 20 07:32:02 crc kubenswrapper[4749]: I0320 07:32:02.491406 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-gmxsx"] Mar 20 07:32:02 crc kubenswrapper[4749]: I0320 07:32:02.494088 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-gmxsx" Mar 20 07:32:02 crc kubenswrapper[4749]: I0320 07:32:02.500129 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 20 07:32:02 crc kubenswrapper[4749]: I0320 07:32:02.503489 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-gmxsx"] Mar 20 07:32:02 crc kubenswrapper[4749]: I0320 07:32:02.546943 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 20 07:32:02 crc kubenswrapper[4749]: I0320 07:32:02.546986 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 20 07:32:02 crc kubenswrapper[4749]: I0320 07:32:02.583311 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjl7l\" (UniqueName: \"kubernetes.io/projected/311c3c9f-da96-491a-a2e1-481b567af283-kube-api-access-jjl7l\") pod \"root-account-create-update-gmxsx\" (UID: \"311c3c9f-da96-491a-a2e1-481b567af283\") " pod="openstack/root-account-create-update-gmxsx" Mar 20 07:32:02 crc kubenswrapper[4749]: I0320 07:32:02.583352 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/311c3c9f-da96-491a-a2e1-481b567af283-operator-scripts\") pod \"root-account-create-update-gmxsx\" (UID: \"311c3c9f-da96-491a-a2e1-481b567af283\") " pod="openstack/root-account-create-update-gmxsx" Mar 20 07:32:02 crc kubenswrapper[4749]: I0320 07:32:02.685166 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjl7l\" (UniqueName: \"kubernetes.io/projected/311c3c9f-da96-491a-a2e1-481b567af283-kube-api-access-jjl7l\") pod \"root-account-create-update-gmxsx\" (UID: \"311c3c9f-da96-491a-a2e1-481b567af283\") " pod="openstack/root-account-create-update-gmxsx" Mar 20 07:32:02 crc kubenswrapper[4749]: I0320 07:32:02.685216 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/311c3c9f-da96-491a-a2e1-481b567af283-operator-scripts\") pod \"root-account-create-update-gmxsx\" (UID: \"311c3c9f-da96-491a-a2e1-481b567af283\") " pod="openstack/root-account-create-update-gmxsx" Mar 20 07:32:02 crc kubenswrapper[4749]: I0320 07:32:02.685954 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/311c3c9f-da96-491a-a2e1-481b567af283-operator-scripts\") pod \"root-account-create-update-gmxsx\" (UID: \"311c3c9f-da96-491a-a2e1-481b567af283\") " pod="openstack/root-account-create-update-gmxsx" Mar 20 07:32:02 crc kubenswrapper[4749]: I0320 07:32:02.705756 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjl7l\" (UniqueName: \"kubernetes.io/projected/311c3c9f-da96-491a-a2e1-481b567af283-kube-api-access-jjl7l\") pod \"root-account-create-update-gmxsx\" (UID: \"311c3c9f-da96-491a-a2e1-481b567af283\") " pod="openstack/root-account-create-update-gmxsx" Mar 20 07:32:02 crc kubenswrapper[4749]: I0320 07:32:02.729346 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 20 07:32:02 crc kubenswrapper[4749]: I0320 07:32:02.815336 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-gmxsx" Mar 20 07:32:02 crc kubenswrapper[4749]: I0320 07:32:02.921701 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-dgbxq" event={"ID":"3adcfcfa-0ea4-4c5e-9e57-957538c1469e","Type":"ContainerStarted","Data":"0e90ef62cbc5c8075e53d48cf7cb1c8b5d794d84f6257afbd684759e0231829c"} Mar 20 07:32:02 crc kubenswrapper[4749]: I0320 07:32:02.923409 4749 generic.go:334] "Generic (PLEG): container finished" podID="d08b93fb-b8b6-4ec4-a812-fa127e9519ae" containerID="5da4b407aba12ab438460c455de94433793fbff7f93e53cae1e23a30e6882c77" exitCode=0 Mar 20 07:32:02 crc kubenswrapper[4749]: I0320 07:32:02.923525 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566532-vqmck" event={"ID":"d08b93fb-b8b6-4ec4-a812-fa127e9519ae","Type":"ContainerDied","Data":"5da4b407aba12ab438460c455de94433793fbff7f93e53cae1e23a30e6882c77"} Mar 20 07:32:02 crc kubenswrapper[4749]: I0320 07:32:02.998322 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 20 07:32:03 crc kubenswrapper[4749]: I0320 07:32:03.241350 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-gmxsx"] Mar 20 07:32:03 crc kubenswrapper[4749]: I0320 07:32:03.938541 4749 generic.go:334] "Generic (PLEG): container finished" podID="311c3c9f-da96-491a-a2e1-481b567af283" containerID="a434af2bf39377c27394825b723ff6b8a39c0413b943fd05a8b143ba92ee7dc8" exitCode=0 Mar 20 07:32:03 crc kubenswrapper[4749]: I0320 07:32:03.938641 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-gmxsx" event={"ID":"311c3c9f-da96-491a-a2e1-481b567af283","Type":"ContainerDied","Data":"a434af2bf39377c27394825b723ff6b8a39c0413b943fd05a8b143ba92ee7dc8"} Mar 20 07:32:03 crc kubenswrapper[4749]: I0320 07:32:03.939016 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-gmxsx" event={"ID":"311c3c9f-da96-491a-a2e1-481b567af283","Type":"ContainerStarted","Data":"eac02a1034c30dc4f8c60c190186d9a1f5d5dc7edeffb35854a1e0437e6b6c60"} Mar 20 07:32:04 crc kubenswrapper[4749]: I0320 07:32:04.317813 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-ds4q2"] Mar 20 07:32:04 crc kubenswrapper[4749]: I0320 07:32:04.319122 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-ds4q2" Mar 20 07:32:04 crc kubenswrapper[4749]: I0320 07:32:04.342342 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-ds4q2"] Mar 20 07:32:04 crc kubenswrapper[4749]: I0320 07:32:04.420631 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jtgj\" (UniqueName: \"kubernetes.io/projected/d7811a5b-1577-4ecc-b54f-949bc39b0289-kube-api-access-8jtgj\") pod \"glance-db-create-ds4q2\" (UID: \"d7811a5b-1577-4ecc-b54f-949bc39b0289\") " pod="openstack/glance-db-create-ds4q2" Mar 20 07:32:04 crc kubenswrapper[4749]: I0320 07:32:04.420807 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7811a5b-1577-4ecc-b54f-949bc39b0289-operator-scripts\") pod \"glance-db-create-ds4q2\" (UID: \"d7811a5b-1577-4ecc-b54f-949bc39b0289\") " pod="openstack/glance-db-create-ds4q2" Mar 20 07:32:04 crc kubenswrapper[4749]: I0320 07:32:04.430066 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-94fb-account-create-update-qpb8t"] Mar 20 07:32:04 crc kubenswrapper[4749]: I0320 07:32:04.431686 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-94fb-account-create-update-qpb8t" Mar 20 07:32:04 crc kubenswrapper[4749]: I0320 07:32:04.433506 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 20 07:32:04 crc kubenswrapper[4749]: I0320 07:32:04.459622 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-94fb-account-create-update-qpb8t"] Mar 20 07:32:04 crc kubenswrapper[4749]: I0320 07:32:04.514773 4749 patch_prober.go:28] interesting pod/machine-config-daemon-fxqfd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 07:32:04 crc kubenswrapper[4749]: I0320 07:32:04.515105 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:32:04 crc kubenswrapper[4749]: I0320 07:32:04.522426 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db14fc2d-a32e-49a6-8d9c-9c6fd3e447f5-operator-scripts\") pod \"glance-94fb-account-create-update-qpb8t\" (UID: \"db14fc2d-a32e-49a6-8d9c-9c6fd3e447f5\") " pod="openstack/glance-94fb-account-create-update-qpb8t" Mar 20 07:32:04 crc kubenswrapper[4749]: I0320 07:32:04.522818 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz5nz\" (UniqueName: \"kubernetes.io/projected/db14fc2d-a32e-49a6-8d9c-9c6fd3e447f5-kube-api-access-fz5nz\") pod \"glance-94fb-account-create-update-qpb8t\" (UID: \"db14fc2d-a32e-49a6-8d9c-9c6fd3e447f5\") " pod="openstack/glance-94fb-account-create-update-qpb8t" Mar 20 07:32:04 crc kubenswrapper[4749]: I0320 07:32:04.522920 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/d7811a5b-1577-4ecc-b54f-949bc39b0289-operator-scripts\") pod \"glance-db-create-ds4q2\" (UID: \"d7811a5b-1577-4ecc-b54f-949bc39b0289\") " pod="openstack/glance-db-create-ds4q2" Mar 20 07:32:04 crc kubenswrapper[4749]: I0320 07:32:04.523074 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jtgj\" (UniqueName: \"kubernetes.io/projected/d7811a5b-1577-4ecc-b54f-949bc39b0289-kube-api-access-8jtgj\") pod \"glance-db-create-ds4q2\" (UID: \"d7811a5b-1577-4ecc-b54f-949bc39b0289\") " pod="openstack/glance-db-create-ds4q2" Mar 20 07:32:04 crc kubenswrapper[4749]: I0320 07:32:04.523688 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7811a5b-1577-4ecc-b54f-949bc39b0289-operator-scripts\") pod \"glance-db-create-ds4q2\" (UID: \"d7811a5b-1577-4ecc-b54f-949bc39b0289\") " pod="openstack/glance-db-create-ds4q2" Mar 20 07:32:04 crc kubenswrapper[4749]: I0320 07:32:04.558073 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jtgj\" (UniqueName: \"kubernetes.io/projected/d7811a5b-1577-4ecc-b54f-949bc39b0289-kube-api-access-8jtgj\") pod \"glance-db-create-ds4q2\" (UID: \"d7811a5b-1577-4ecc-b54f-949bc39b0289\") " pod="openstack/glance-db-create-ds4q2" Mar 20 07:32:04 crc kubenswrapper[4749]: I0320 07:32:04.625309 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db14fc2d-a32e-49a6-8d9c-9c6fd3e447f5-operator-scripts\") pod \"glance-94fb-account-create-update-qpb8t\" (UID: \"db14fc2d-a32e-49a6-8d9c-9c6fd3e447f5\") " pod="openstack/glance-94fb-account-create-update-qpb8t" Mar 20 07:32:04 crc kubenswrapper[4749]: I0320 07:32:04.625444 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fz5nz\" (UniqueName: \"kubernetes.io/projected/db14fc2d-a32e-49a6-8d9c-9c6fd3e447f5-kube-api-access-fz5nz\") pod \"glance-94fb-account-create-update-qpb8t\" (UID: \"db14fc2d-a32e-49a6-8d9c-9c6fd3e447f5\") " pod="openstack/glance-94fb-account-create-update-qpb8t" Mar 20 07:32:04 crc kubenswrapper[4749]: I0320 07:32:04.626707 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db14fc2d-a32e-49a6-8d9c-9c6fd3e447f5-operator-scripts\") pod \"glance-94fb-account-create-update-qpb8t\" (UID: \"db14fc2d-a32e-49a6-8d9c-9c6fd3e447f5\") " pod="openstack/glance-94fb-account-create-update-qpb8t" Mar 20 07:32:04 crc kubenswrapper[4749]: I0320 07:32:04.645154 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz5nz\" (UniqueName: \"kubernetes.io/projected/db14fc2d-a32e-49a6-8d9c-9c6fd3e447f5-kube-api-access-fz5nz\") pod \"glance-94fb-account-create-update-qpb8t\" (UID: \"db14fc2d-a32e-49a6-8d9c-9c6fd3e447f5\") " pod="openstack/glance-94fb-account-create-update-qpb8t" Mar 20 07:32:04 crc kubenswrapper[4749]: I0320 07:32:04.652439 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-ds4q2" Mar 20 07:32:04 crc kubenswrapper[4749]: I0320 07:32:04.751065 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-94fb-account-create-update-qpb8t" Mar 20 07:32:05 crc kubenswrapper[4749]: I0320 07:32:05.336691 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-vvwnz"] Mar 20 07:32:05 crc kubenswrapper[4749]: I0320 07:32:05.337931 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-vvwnz" Mar 20 07:32:05 crc kubenswrapper[4749]: I0320 07:32:05.343038 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-b25f-account-create-update-4prvn"] Mar 20 07:32:05 crc kubenswrapper[4749]: I0320 07:32:05.344121 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b25f-account-create-update-4prvn" Mar 20 07:32:05 crc kubenswrapper[4749]: I0320 07:32:05.346113 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 20 07:32:05 crc kubenswrapper[4749]: I0320 07:32:05.351409 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-vvwnz"] Mar 20 07:32:05 crc kubenswrapper[4749]: I0320 07:32:05.358992 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-b25f-account-create-update-4prvn"] Mar 20 07:32:05 crc kubenswrapper[4749]: I0320 07:32:05.437065 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5dae8082-8d1a-448c-961f-bf0c58f0bd81-operator-scripts\") pod \"keystone-b25f-account-create-update-4prvn\" (UID: \"5dae8082-8d1a-448c-961f-bf0c58f0bd81\") " pod="openstack/keystone-b25f-account-create-update-4prvn" Mar 20 07:32:05 crc kubenswrapper[4749]: I0320 07:32:05.437166 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzkx2\" (UniqueName: \"kubernetes.io/projected/f61da3ae-a72f-4d88-b8cc-38d0503649d8-kube-api-access-tzkx2\") pod \"keystone-db-create-vvwnz\" (UID: \"f61da3ae-a72f-4d88-b8cc-38d0503649d8\") " pod="openstack/keystone-db-create-vvwnz" Mar 20 07:32:05 crc kubenswrapper[4749]: I0320 07:32:05.437223 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks6k8\" (UniqueName: \"kubernetes.io/projected/5dae8082-8d1a-448c-961f-bf0c58f0bd81-kube-api-access-ks6k8\") pod \"keystone-b25f-account-create-update-4prvn\" (UID: \"5dae8082-8d1a-448c-961f-bf0c58f0bd81\") " pod="openstack/keystone-b25f-account-create-update-4prvn" Mar 20 07:32:05 crc kubenswrapper[4749]: I0320 07:32:05.437244 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f61da3ae-a72f-4d88-b8cc-38d0503649d8-operator-scripts\") pod \"keystone-db-create-vvwnz\" (UID: \"f61da3ae-a72f-4d88-b8cc-38d0503649d8\") " pod="openstack/keystone-db-create-vvwnz" Mar 20 07:32:05 crc kubenswrapper[4749]: I0320 07:32:05.467873 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-6cllh"] Mar 20 07:32:05 crc kubenswrapper[4749]: I0320 07:32:05.468851 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-6cllh" Mar 20 07:32:05 crc kubenswrapper[4749]: I0320 07:32:05.486316 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-6cllh"] Mar 20 07:32:05 crc kubenswrapper[4749]: I0320 07:32:05.501928 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-563c-account-create-update-xjv9j"] Mar 20 07:32:05 crc kubenswrapper[4749]: I0320 07:32:05.503132 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-563c-account-create-update-xjv9j" Mar 20 07:32:05 crc kubenswrapper[4749]: I0320 07:32:05.505615 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 20 07:32:05 crc kubenswrapper[4749]: I0320 07:32:05.530693 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-563c-account-create-update-xjv9j"] Mar 20 07:32:05 crc kubenswrapper[4749]: I0320 07:32:05.538333 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzkx2\" (UniqueName: \"kubernetes.io/projected/f61da3ae-a72f-4d88-b8cc-38d0503649d8-kube-api-access-tzkx2\") pod \"keystone-db-create-vvwnz\" (UID: \"f61da3ae-a72f-4d88-b8cc-38d0503649d8\") " pod="openstack/keystone-db-create-vvwnz" Mar 20 07:32:05 crc kubenswrapper[4749]: I0320 07:32:05.538415 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ks6k8\" (UniqueName: \"kubernetes.io/projected/5dae8082-8d1a-448c-961f-bf0c58f0bd81-kube-api-access-ks6k8\") pod \"keystone-b25f-account-create-update-4prvn\" (UID: \"5dae8082-8d1a-448c-961f-bf0c58f0bd81\") " pod="openstack/keystone-b25f-account-create-update-4prvn" Mar 20 07:32:05 crc kubenswrapper[4749]: I0320 07:32:05.538443 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f61da3ae-a72f-4d88-b8cc-38d0503649d8-operator-scripts\") pod \"keystone-db-create-vvwnz\" (UID: \"f61da3ae-a72f-4d88-b8cc-38d0503649d8\") " pod="openstack/keystone-db-create-vvwnz" Mar 20 07:32:05 crc kubenswrapper[4749]: I0320 07:32:05.538487 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f48a0db5-3834-46dd-a959-a6a4e67fc1dd-operator-scripts\") pod \"placement-db-create-6cllh\" (UID: \"f48a0db5-3834-46dd-a959-a6a4e67fc1dd\") " pod="openstack/placement-db-create-6cllh" Mar 20 07:32:05 crc kubenswrapper[4749]: I0320 07:32:05.538511 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knvgt\" (UniqueName: \"kubernetes.io/projected/f48a0db5-3834-46dd-a959-a6a4e67fc1dd-kube-api-access-knvgt\") pod \"placement-db-create-6cllh\" (UID: \"f48a0db5-3834-46dd-a959-a6a4e67fc1dd\") " pod="openstack/placement-db-create-6cllh" Mar 20 07:32:05 crc kubenswrapper[4749]: I0320 07:32:05.538550 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8272956e-b31a-4bd8-9118-3ca9721e6d75-etc-swift\") pod \"swift-storage-0\" (UID: \"8272956e-b31a-4bd8-9118-3ca9721e6d75\") " pod="openstack/swift-storage-0" Mar 20 07:32:05 crc kubenswrapper[4749]: I0320 07:32:05.538574 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/5dae8082-8d1a-448c-961f-bf0c58f0bd81-operator-scripts\") pod \"keystone-b25f-account-create-update-4prvn\" (UID: \"5dae8082-8d1a-448c-961f-bf0c58f0bd81\") " pod="openstack/keystone-b25f-account-create-update-4prvn" Mar 20 07:32:05 crc kubenswrapper[4749]: E0320 07:32:05.538742 4749 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 07:32:05 crc kubenswrapper[4749]: E0320 07:32:05.538763 4749 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 07:32:05 crc kubenswrapper[4749]: E0320 07:32:05.538809 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8272956e-b31a-4bd8-9118-3ca9721e6d75-etc-swift podName:8272956e-b31a-4bd8-9118-3ca9721e6d75 nodeName:}" failed. No retries permitted until 2026-03-20 07:32:13.538790132 +0000 UTC m=+1170.088447779 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8272956e-b31a-4bd8-9118-3ca9721e6d75-etc-swift") pod "swift-storage-0" (UID: "8272956e-b31a-4bd8-9118-3ca9721e6d75") : configmap "swift-ring-files" not found Mar 20 07:32:05 crc kubenswrapper[4749]: I0320 07:32:05.539495 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f61da3ae-a72f-4d88-b8cc-38d0503649d8-operator-scripts\") pod \"keystone-db-create-vvwnz\" (UID: \"f61da3ae-a72f-4d88-b8cc-38d0503649d8\") " pod="openstack/keystone-db-create-vvwnz" Mar 20 07:32:05 crc kubenswrapper[4749]: I0320 07:32:05.539693 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5dae8082-8d1a-448c-961f-bf0c58f0bd81-operator-scripts\") pod \"keystone-b25f-account-create-update-4prvn\" (UID: \"5dae8082-8d1a-448c-961f-bf0c58f0bd81\") " pod="openstack/keystone-b25f-account-create-update-4prvn" Mar 20 07:32:05 crc kubenswrapper[4749]: I0320 07:32:05.557150 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzkx2\" (UniqueName: \"kubernetes.io/projected/f61da3ae-a72f-4d88-b8cc-38d0503649d8-kube-api-access-tzkx2\") pod \"keystone-db-create-vvwnz\" (UID: \"f61da3ae-a72f-4d88-b8cc-38d0503649d8\") " pod="openstack/keystone-db-create-vvwnz" Mar 20 07:32:05 crc kubenswrapper[4749]: I0320 07:32:05.557893 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks6k8\" (UniqueName: \"kubernetes.io/projected/5dae8082-8d1a-448c-961f-bf0c58f0bd81-kube-api-access-ks6k8\") pod \"keystone-b25f-account-create-update-4prvn\" (UID: \"5dae8082-8d1a-448c-961f-bf0c58f0bd81\") " pod="openstack/keystone-b25f-account-create-update-4prvn" Mar 20 07:32:05 crc kubenswrapper[4749]: I0320 07:32:05.639723 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9ddc4f6-6f60-489c-bedb-44a31de6894e-operator-scripts\") pod \"placement-563c-account-create-update-xjv9j\" (UID: \"e9ddc4f6-6f60-489c-bedb-44a31de6894e\") " pod="openstack/placement-563c-account-create-update-xjv9j" Mar 20 07:32:05 crc kubenswrapper[4749]: I0320 07:32:05.639824 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f48a0db5-3834-46dd-a959-a6a4e67fc1dd-operator-scripts\") pod 
\"placement-db-create-6cllh\" (UID: \"f48a0db5-3834-46dd-a959-a6a4e67fc1dd\") " pod="openstack/placement-db-create-6cllh" Mar 20 07:32:05 crc kubenswrapper[4749]: I0320 07:32:05.639862 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knvgt\" (UniqueName: \"kubernetes.io/projected/f48a0db5-3834-46dd-a959-a6a4e67fc1dd-kube-api-access-knvgt\") pod \"placement-db-create-6cllh\" (UID: \"f48a0db5-3834-46dd-a959-a6a4e67fc1dd\") " pod="openstack/placement-db-create-6cllh" Mar 20 07:32:05 crc kubenswrapper[4749]: I0320 07:32:05.639945 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blzd7\" (UniqueName: \"kubernetes.io/projected/e9ddc4f6-6f60-489c-bedb-44a31de6894e-kube-api-access-blzd7\") pod \"placement-563c-account-create-update-xjv9j\" (UID: \"e9ddc4f6-6f60-489c-bedb-44a31de6894e\") " pod="openstack/placement-563c-account-create-update-xjv9j" Mar 20 07:32:05 crc kubenswrapper[4749]: I0320 07:32:05.641140 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f48a0db5-3834-46dd-a959-a6a4e67fc1dd-operator-scripts\") pod \"placement-db-create-6cllh\" (UID: \"f48a0db5-3834-46dd-a959-a6a4e67fc1dd\") " pod="openstack/placement-db-create-6cllh" Mar 20 07:32:05 crc kubenswrapper[4749]: I0320 07:32:05.661806 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knvgt\" (UniqueName: \"kubernetes.io/projected/f48a0db5-3834-46dd-a959-a6a4e67fc1dd-kube-api-access-knvgt\") pod \"placement-db-create-6cllh\" (UID: \"f48a0db5-3834-46dd-a959-a6a4e67fc1dd\") " pod="openstack/placement-db-create-6cllh" Mar 20 07:32:05 crc kubenswrapper[4749]: I0320 07:32:05.670993 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-vvwnz" Mar 20 07:32:05 crc kubenswrapper[4749]: I0320 07:32:05.685690 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-b25f-account-create-update-4prvn" Mar 20 07:32:05 crc kubenswrapper[4749]: I0320 07:32:05.741927 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blzd7\" (UniqueName: \"kubernetes.io/projected/e9ddc4f6-6f60-489c-bedb-44a31de6894e-kube-api-access-blzd7\") pod \"placement-563c-account-create-update-xjv9j\" (UID: \"e9ddc4f6-6f60-489c-bedb-44a31de6894e\") " pod="openstack/placement-563c-account-create-update-xjv9j" Mar 20 07:32:05 crc kubenswrapper[4749]: I0320 07:32:05.742038 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9ddc4f6-6f60-489c-bedb-44a31de6894e-operator-scripts\") pod \"placement-563c-account-create-update-xjv9j\" (UID: \"e9ddc4f6-6f60-489c-bedb-44a31de6894e\") " pod="openstack/placement-563c-account-create-update-xjv9j" Mar 20 07:32:05 crc kubenswrapper[4749]: I0320 07:32:05.742881 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9ddc4f6-6f60-489c-bedb-44a31de6894e-operator-scripts\") pod \"placement-563c-account-create-update-xjv9j\" (UID: \"e9ddc4f6-6f60-489c-bedb-44a31de6894e\") " pod="openstack/placement-563c-account-create-update-xjv9j" Mar 20 07:32:05 crc kubenswrapper[4749]: I0320 07:32:05.762365 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blzd7\" (UniqueName: \"kubernetes.io/projected/e9ddc4f6-6f60-489c-bedb-44a31de6894e-kube-api-access-blzd7\") pod \"placement-563c-account-create-update-xjv9j\" (UID: \"e9ddc4f6-6f60-489c-bedb-44a31de6894e\") " pod="openstack/placement-563c-account-create-update-xjv9j" Mar 20 07:32:05 crc kubenswrapper[4749]: I0320 07:32:05.805361 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-6cllh" Mar 20 07:32:05 crc kubenswrapper[4749]: I0320 07:32:05.830446 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-563c-account-create-update-xjv9j" Mar 20 07:32:06 crc kubenswrapper[4749]: I0320 07:32:06.328348 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-gmxsx" Mar 20 07:32:06 crc kubenswrapper[4749]: I0320 07:32:06.335078 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566532-vqmck" Mar 20 07:32:06 crc kubenswrapper[4749]: I0320 07:32:06.464083 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-749hn\" (UniqueName: \"kubernetes.io/projected/d08b93fb-b8b6-4ec4-a812-fa127e9519ae-kube-api-access-749hn\") pod \"d08b93fb-b8b6-4ec4-a812-fa127e9519ae\" (UID: \"d08b93fb-b8b6-4ec4-a812-fa127e9519ae\") " Mar 20 07:32:06 crc kubenswrapper[4749]: I0320 07:32:06.464165 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/311c3c9f-da96-491a-a2e1-481b567af283-operator-scripts\") pod \"311c3c9f-da96-491a-a2e1-481b567af283\" (UID: \"311c3c9f-da96-491a-a2e1-481b567af283\") " Mar 20 07:32:06 crc kubenswrapper[4749]: I0320 07:32:06.464359 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjl7l\" (UniqueName: \"kubernetes.io/projected/311c3c9f-da96-491a-a2e1-481b567af283-kube-api-access-jjl7l\") pod \"311c3c9f-da96-491a-a2e1-481b567af283\" (UID: \"311c3c9f-da96-491a-a2e1-481b567af283\") " Mar 20 07:32:06 crc kubenswrapper[4749]: I0320 07:32:06.465131 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/311c3c9f-da96-491a-a2e1-481b567af283-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "311c3c9f-da96-491a-a2e1-481b567af283" (UID: "311c3c9f-da96-491a-a2e1-481b567af283"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:32:06 crc kubenswrapper[4749]: I0320 07:32:06.471654 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/311c3c9f-da96-491a-a2e1-481b567af283-kube-api-access-jjl7l" (OuterVolumeSpecName: "kube-api-access-jjl7l") pod "311c3c9f-da96-491a-a2e1-481b567af283" (UID: "311c3c9f-da96-491a-a2e1-481b567af283"). InnerVolumeSpecName "kube-api-access-jjl7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:32:06 crc kubenswrapper[4749]: I0320 07:32:06.476217 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d08b93fb-b8b6-4ec4-a812-fa127e9519ae-kube-api-access-749hn" (OuterVolumeSpecName: "kube-api-access-749hn") pod "d08b93fb-b8b6-4ec4-a812-fa127e9519ae" (UID: "d08b93fb-b8b6-4ec4-a812-fa127e9519ae"). InnerVolumeSpecName "kube-api-access-749hn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:32:06 crc kubenswrapper[4749]: I0320 07:32:06.566670 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjl7l\" (UniqueName: \"kubernetes.io/projected/311c3c9f-da96-491a-a2e1-481b567af283-kube-api-access-jjl7l\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:06 crc kubenswrapper[4749]: I0320 07:32:06.566926 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-749hn\" (UniqueName: \"kubernetes.io/projected/d08b93fb-b8b6-4ec4-a812-fa127e9519ae-kube-api-access-749hn\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:06 crc kubenswrapper[4749]: I0320 07:32:06.566937 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/311c3c9f-da96-491a-a2e1-481b567af283-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:06 crc kubenswrapper[4749]: I0320 07:32:06.865448 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-f5tdv" Mar 20 07:32:06 crc kubenswrapper[4749]: I0320 07:32:06.874357 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-ds4q2"] Mar 20 07:32:06 crc kubenswrapper[4749]: I0320 07:32:06.881613 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-563c-account-create-update-xjv9j"] Mar 20 07:32:06 crc kubenswrapper[4749]: W0320 07:32:06.899848 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf48a0db5_3834_46dd_a959_a6a4e67fc1dd.slice/crio-b53bba30f6c46d8b88b60a4b5286388894120ed0ce4f84e26f2aeb4f609cf7d0 WatchSource:0}: Error finding container b53bba30f6c46d8b88b60a4b5286388894120ed0ce4f84e26f2aeb4f609cf7d0: Status 404 returned error can't find the container with id b53bba30f6c46d8b88b60a4b5286388894120ed0ce4f84e26f2aeb4f609cf7d0 Mar 20 07:32:06 crc kubenswrapper[4749]: I0320 07:32:06.915943 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-6cllh"] Mar 20 07:32:06 crc kubenswrapper[4749]: I0320 07:32:06.949149 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-b628q"] Mar 20 07:32:06 crc kubenswrapper[4749]: I0320 07:32:06.949670 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-b628q" podUID="dc41dd13-79db-4f94-a11b-bc0cf369bb76" containerName="dnsmasq-dns" containerID="cri-o://ca58e44aa9e9389828aedd3563fe417bbd6fd7d1c1ff6a1a879bebe6d3529496" gracePeriod=10 Mar 20 07:32:06 crc kubenswrapper[4749]: I0320 07:32:06.979150 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-dgbxq" event={"ID":"3adcfcfa-0ea4-4c5e-9e57-957538c1469e","Type":"ContainerStarted","Data":"d887667291809460ec9e6fd870e34f27450b46787cbe5148eaf7cc5ab07c1296"} Mar 20 07:32:06 crc kubenswrapper[4749]: I0320 07:32:06.983405 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ds4q2" event={"ID":"d7811a5b-1577-4ecc-b54f-949bc39b0289","Type":"ContainerStarted","Data":"252fbb69cf0eabb313b08a486ae2e0f5bf00a781a2ce62da13da71896ac24e13"} Mar 20 07:32:06 crc kubenswrapper[4749]: I0320 07:32:06.985138 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566532-vqmck" 
event={"ID":"d08b93fb-b8b6-4ec4-a812-fa127e9519ae","Type":"ContainerDied","Data":"92a7598dabd62e1e0d796f88514115cfde9946d58e0333940a7ec9c50f10906f"} Mar 20 07:32:06 crc kubenswrapper[4749]: I0320 07:32:06.985163 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92a7598dabd62e1e0d796f88514115cfde9946d58e0333940a7ec9c50f10906f" Mar 20 07:32:06 crc kubenswrapper[4749]: I0320 07:32:06.985143 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566532-vqmck" Mar 20 07:32:06 crc kubenswrapper[4749]: I0320 07:32:06.991793 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-gmxsx" event={"ID":"311c3c9f-da96-491a-a2e1-481b567af283","Type":"ContainerDied","Data":"eac02a1034c30dc4f8c60c190186d9a1f5d5dc7edeffb35854a1e0437e6b6c60"} Mar 20 07:32:06 crc kubenswrapper[4749]: I0320 07:32:06.991828 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-gmxsx" Mar 20 07:32:06 crc kubenswrapper[4749]: I0320 07:32:06.991838 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eac02a1034c30dc4f8c60c190186d9a1f5d5dc7edeffb35854a1e0437e6b6c60" Mar 20 07:32:07 crc kubenswrapper[4749]: I0320 07:32:07.002957 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-dgbxq" podStartSLOduration=2.038689179 podStartE2EDuration="6.00294211s" podCreationTimestamp="2026-03-20 07:32:01 +0000 UTC" firstStartedPulling="2026-03-20 07:32:02.415262448 +0000 UTC m=+1158.964920095" lastFinishedPulling="2026-03-20 07:32:06.379515379 +0000 UTC m=+1162.929173026" observedRunningTime="2026-03-20 07:32:07.002049718 +0000 UTC m=+1163.551707365" watchObservedRunningTime="2026-03-20 07:32:07.00294211 +0000 UTC m=+1163.552599757" Mar 20 07:32:07 crc kubenswrapper[4749]: I0320 07:32:07.025274 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6cllh" event={"ID":"f48a0db5-3834-46dd-a959-a6a4e67fc1dd","Type":"ContainerStarted","Data":"b53bba30f6c46d8b88b60a4b5286388894120ed0ce4f84e26f2aeb4f609cf7d0"} Mar 20 07:32:07 crc kubenswrapper[4749]: I0320 07:32:07.041066 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-94fb-account-create-update-qpb8t"] Mar 20 07:32:07 crc kubenswrapper[4749]: W0320 07:32:07.053113 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf61da3ae_a72f_4d88_b8cc_38d0503649d8.slice/crio-159e3e318ced0639ac0dfce98f9808b4340ae72da4cc8286aa44eff592af931e WatchSource:0}: Error finding container 159e3e318ced0639ac0dfce98f9808b4340ae72da4cc8286aa44eff592af931e: Status 404 returned error can't find the container with id 159e3e318ced0639ac0dfce98f9808b4340ae72da4cc8286aa44eff592af931e Mar 20 07:32:07 crc kubenswrapper[4749]: I0320 07:32:07.068467 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-vvwnz"] Mar 20 07:32:07 crc kubenswrapper[4749]: I0320 07:32:07.102436 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-b25f-account-create-update-4prvn"] Mar 20 07:32:07 crc kubenswrapper[4749]: I0320 07:32:07.398402 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-b628q" Mar 20 07:32:07 crc kubenswrapper[4749]: I0320 07:32:07.421856 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566526-62x7p"] Mar 20 07:32:07 crc kubenswrapper[4749]: I0320 07:32:07.433772 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566526-62x7p"] Mar 20 07:32:07 crc kubenswrapper[4749]: I0320 07:32:07.492797 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hm7tg\" (UniqueName: \"kubernetes.io/projected/dc41dd13-79db-4f94-a11b-bc0cf369bb76-kube-api-access-hm7tg\") pod \"dc41dd13-79db-4f94-a11b-bc0cf369bb76\" (UID: \"dc41dd13-79db-4f94-a11b-bc0cf369bb76\") " Mar 20 07:32:07 crc kubenswrapper[4749]: I0320 07:32:07.492865 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dc41dd13-79db-4f94-a11b-bc0cf369bb76-ovsdbserver-sb\") pod \"dc41dd13-79db-4f94-a11b-bc0cf369bb76\" (UID: \"dc41dd13-79db-4f94-a11b-bc0cf369bb76\") " Mar 20 07:32:07 crc kubenswrapper[4749]: I0320 07:32:07.492907 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dc41dd13-79db-4f94-a11b-bc0cf369bb76-ovsdbserver-nb\") pod \"dc41dd13-79db-4f94-a11b-bc0cf369bb76\" (UID: \"dc41dd13-79db-4f94-a11b-bc0cf369bb76\") " Mar 20 07:32:07 crc kubenswrapper[4749]: I0320 07:32:07.493099 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc41dd13-79db-4f94-a11b-bc0cf369bb76-dns-svc\") pod \"dc41dd13-79db-4f94-a11b-bc0cf369bb76\" (UID: \"dc41dd13-79db-4f94-a11b-bc0cf369bb76\") " Mar 20 07:32:07 crc kubenswrapper[4749]: I0320 07:32:07.493124 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc41dd13-79db-4f94-a11b-bc0cf369bb76-config\") pod \"dc41dd13-79db-4f94-a11b-bc0cf369bb76\" (UID: \"dc41dd13-79db-4f94-a11b-bc0cf369bb76\") " Mar 20 07:32:07 crc kubenswrapper[4749]: I0320 07:32:07.502453 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc41dd13-79db-4f94-a11b-bc0cf369bb76-kube-api-access-hm7tg" (OuterVolumeSpecName: "kube-api-access-hm7tg") pod "dc41dd13-79db-4f94-a11b-bc0cf369bb76" (UID: "dc41dd13-79db-4f94-a11b-bc0cf369bb76"). InnerVolumeSpecName "kube-api-access-hm7tg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:32:07 crc kubenswrapper[4749]: I0320 07:32:07.536961 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc41dd13-79db-4f94-a11b-bc0cf369bb76-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dc41dd13-79db-4f94-a11b-bc0cf369bb76" (UID: "dc41dd13-79db-4f94-a11b-bc0cf369bb76"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:32:07 crc kubenswrapper[4749]: I0320 07:32:07.536990 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc41dd13-79db-4f94-a11b-bc0cf369bb76-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "dc41dd13-79db-4f94-a11b-bc0cf369bb76" (UID: "dc41dd13-79db-4f94-a11b-bc0cf369bb76"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:32:07 crc kubenswrapper[4749]: I0320 07:32:07.544665 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc41dd13-79db-4f94-a11b-bc0cf369bb76-config" (OuterVolumeSpecName: "config") pod "dc41dd13-79db-4f94-a11b-bc0cf369bb76" (UID: "dc41dd13-79db-4f94-a11b-bc0cf369bb76"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:32:07 crc kubenswrapper[4749]: I0320 07:32:07.558851 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc41dd13-79db-4f94-a11b-bc0cf369bb76-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dc41dd13-79db-4f94-a11b-bc0cf369bb76" (UID: "dc41dd13-79db-4f94-a11b-bc0cf369bb76"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:32:07 crc kubenswrapper[4749]: I0320 07:32:07.595516 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hm7tg\" (UniqueName: \"kubernetes.io/projected/dc41dd13-79db-4f94-a11b-bc0cf369bb76-kube-api-access-hm7tg\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:07 crc kubenswrapper[4749]: I0320 07:32:07.595557 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dc41dd13-79db-4f94-a11b-bc0cf369bb76-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:07 crc kubenswrapper[4749]: I0320 07:32:07.595568 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dc41dd13-79db-4f94-a11b-bc0cf369bb76-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:07 crc kubenswrapper[4749]: I0320 07:32:07.595582 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc41dd13-79db-4f94-a11b-bc0cf369bb76-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:07 crc kubenswrapper[4749]: I0320 07:32:07.595591 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc41dd13-79db-4f94-a11b-bc0cf369bb76-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:08 crc kubenswrapper[4749]: I0320 07:32:08.033301 4749 generic.go:334] "Generic (PLEG): container finished" podID="dc41dd13-79db-4f94-a11b-bc0cf369bb76" containerID="ca58e44aa9e9389828aedd3563fe417bbd6fd7d1c1ff6a1a879bebe6d3529496" exitCode=0 Mar 20 07:32:08 crc kubenswrapper[4749]: I0320 07:32:08.033372 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-b628q" event={"ID":"dc41dd13-79db-4f94-a11b-bc0cf369bb76","Type":"ContainerDied","Data":"ca58e44aa9e9389828aedd3563fe417bbd6fd7d1c1ff6a1a879bebe6d3529496"} Mar 20 07:32:08 crc kubenswrapper[4749]: I0320 07:32:08.033383 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-b628q" Mar 20 07:32:08 crc kubenswrapper[4749]: I0320 07:32:08.033398 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-b628q" event={"ID":"dc41dd13-79db-4f94-a11b-bc0cf369bb76","Type":"ContainerDied","Data":"60834df33a440ec4f4e87dcab4614a7222f8d5230c9617b63bf99abc593dbf59"} Mar 20 07:32:08 crc kubenswrapper[4749]: I0320 07:32:08.033432 4749 scope.go:117] "RemoveContainer" containerID="ca58e44aa9e9389828aedd3563fe417bbd6fd7d1c1ff6a1a879bebe6d3529496" Mar 20 07:32:08 crc kubenswrapper[4749]: I0320 07:32:08.035070 4749 generic.go:334] "Generic (PLEG): container finished" podID="db14fc2d-a32e-49a6-8d9c-9c6fd3e447f5" containerID="80d77ee4a1bf66d80163f8a5526304fdccc195168842114249293e02583e9d40" exitCode=0 Mar 20 07:32:08 crc kubenswrapper[4749]: I0320 07:32:08.035200 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-94fb-account-create-update-qpb8t" event={"ID":"db14fc2d-a32e-49a6-8d9c-9c6fd3e447f5","Type":"ContainerDied","Data":"80d77ee4a1bf66d80163f8a5526304fdccc195168842114249293e02583e9d40"} Mar 20 07:32:08 crc kubenswrapper[4749]: I0320 07:32:08.035249 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-94fb-account-create-update-qpb8t" event={"ID":"db14fc2d-a32e-49a6-8d9c-9c6fd3e447f5","Type":"ContainerStarted","Data":"5bebc714906e2d3744774dafd3bf2208504019418e1d00b10b0aeaf1da72ee8f"} Mar 20 07:32:08 crc kubenswrapper[4749]: I0320 07:32:08.037909 4749 generic.go:334] "Generic (PLEG): container finished" podID="f48a0db5-3834-46dd-a959-a6a4e67fc1dd" containerID="ffdce9f2c2b0948fa7c96d51232cc816332f88386d3cfa8e8e6a1162fd250742" exitCode=0 Mar 20 07:32:08 crc kubenswrapper[4749]: I0320 07:32:08.037968 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6cllh" event={"ID":"f48a0db5-3834-46dd-a959-a6a4e67fc1dd","Type":"ContainerDied","Data":"ffdce9f2c2b0948fa7c96d51232cc816332f88386d3cfa8e8e6a1162fd250742"} Mar 20 07:32:08 crc kubenswrapper[4749]: I0320 07:32:08.040492 4749 generic.go:334] "Generic (PLEG): container finished" podID="5dae8082-8d1a-448c-961f-bf0c58f0bd81" containerID="4ad168f5534165dde867123251fc544a3c2c5edd7514e8ed47cd8f3022b4a2a9" exitCode=0 Mar 20 07:32:08 crc kubenswrapper[4749]: I0320 07:32:08.040577 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b25f-account-create-update-4prvn" event={"ID":"5dae8082-8d1a-448c-961f-bf0c58f0bd81","Type":"ContainerDied","Data":"4ad168f5534165dde867123251fc544a3c2c5edd7514e8ed47cd8f3022b4a2a9"} Mar 20 07:32:08 crc kubenswrapper[4749]: I0320 07:32:08.040637 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b25f-account-create-update-4prvn" event={"ID":"5dae8082-8d1a-448c-961f-bf0c58f0bd81","Type":"ContainerStarted","Data":"aedc4733bbee3effd6b731f742b1d33e66c857a25c3acca78bd6cc6d6ee40d5c"} Mar 20 07:32:08 crc kubenswrapper[4749]: I0320 07:32:08.043646 4749 generic.go:334] "Generic (PLEG): container finished" podID="d7811a5b-1577-4ecc-b54f-949bc39b0289" containerID="b1cc3cc1324d1ac28122d37be91909fea04a11b4e88444c7ab5db72214f77e7e" exitCode=0 Mar 20 07:32:08 crc kubenswrapper[4749]: I0320 07:32:08.043673 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ds4q2" event={"ID":"d7811a5b-1577-4ecc-b54f-949bc39b0289","Type":"ContainerDied","Data":"b1cc3cc1324d1ac28122d37be91909fea04a11b4e88444c7ab5db72214f77e7e"} Mar 20 07:32:08 crc kubenswrapper[4749]: I0320 
07:32:08.045180 4749 generic.go:334] "Generic (PLEG): container finished" podID="f61da3ae-a72f-4d88-b8cc-38d0503649d8" containerID="ff4e7b076ebfd76e033599e596be9afea75dd61efe5c4cfab47bde686b3cad6a" exitCode=0 Mar 20 07:32:08 crc kubenswrapper[4749]: I0320 07:32:08.045218 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-vvwnz" event={"ID":"f61da3ae-a72f-4d88-b8cc-38d0503649d8","Type":"ContainerDied","Data":"ff4e7b076ebfd76e033599e596be9afea75dd61efe5c4cfab47bde686b3cad6a"} Mar 20 07:32:08 crc kubenswrapper[4749]: I0320 07:32:08.045242 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-vvwnz" event={"ID":"f61da3ae-a72f-4d88-b8cc-38d0503649d8","Type":"ContainerStarted","Data":"159e3e318ced0639ac0dfce98f9808b4340ae72da4cc8286aa44eff592af931e"} Mar 20 07:32:08 crc kubenswrapper[4749]: I0320 07:32:08.046955 4749 generic.go:334] "Generic (PLEG): container finished" podID="e9ddc4f6-6f60-489c-bedb-44a31de6894e" containerID="cb8d13cd73d35aab6a3382d05cc06e4ed46c7b05ec3a43313054396575f48876" exitCode=0 Mar 20 07:32:08 crc kubenswrapper[4749]: I0320 07:32:08.047539 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-563c-account-create-update-xjv9j" event={"ID":"e9ddc4f6-6f60-489c-bedb-44a31de6894e","Type":"ContainerDied","Data":"cb8d13cd73d35aab6a3382d05cc06e4ed46c7b05ec3a43313054396575f48876"} Mar 20 07:32:08 crc kubenswrapper[4749]: I0320 07:32:08.047566 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-563c-account-create-update-xjv9j" event={"ID":"e9ddc4f6-6f60-489c-bedb-44a31de6894e","Type":"ContainerStarted","Data":"83d9ee994aee160850d6f331b089e199481605431feff27672b91adb5e753fd1"} Mar 20 07:32:08 crc kubenswrapper[4749]: I0320 07:32:08.071500 4749 scope.go:117] "RemoveContainer" containerID="2644c3607f0576b1dc2cf9c609f794a4de541fac2b85b056baab58e0a9bad0f6" Mar 20 07:32:08 crc kubenswrapper[4749]: I0320 07:32:08.118233 4749 scope.go:117] "RemoveContainer" containerID="ca58e44aa9e9389828aedd3563fe417bbd6fd7d1c1ff6a1a879bebe6d3529496" Mar 20 07:32:08 crc kubenswrapper[4749]: E0320 07:32:08.126812 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca58e44aa9e9389828aedd3563fe417bbd6fd7d1c1ff6a1a879bebe6d3529496\": container with ID starting with ca58e44aa9e9389828aedd3563fe417bbd6fd7d1c1ff6a1a879bebe6d3529496 not found: ID does not exist" containerID="ca58e44aa9e9389828aedd3563fe417bbd6fd7d1c1ff6a1a879bebe6d3529496" Mar 20 07:32:08 crc kubenswrapper[4749]: I0320 07:32:08.128452 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca58e44aa9e9389828aedd3563fe417bbd6fd7d1c1ff6a1a879bebe6d3529496"} err="failed to get container status \"ca58e44aa9e9389828aedd3563fe417bbd6fd7d1c1ff6a1a879bebe6d3529496\": rpc error: code = NotFound desc = could not find container \"ca58e44aa9e9389828aedd3563fe417bbd6fd7d1c1ff6a1a879bebe6d3529496\": container with ID starting with ca58e44aa9e9389828aedd3563fe417bbd6fd7d1c1ff6a1a879bebe6d3529496 not found: ID does not exist" Mar 20 07:32:08 crc kubenswrapper[4749]: I0320 07:32:08.128543 4749 scope.go:117] "RemoveContainer" containerID="2644c3607f0576b1dc2cf9c609f794a4de541fac2b85b056baab58e0a9bad0f6" Mar 20 07:32:08 crc kubenswrapper[4749]: E0320 07:32:08.129732 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2644c3607f0576b1dc2cf9c609f794a4de541fac2b85b056baab58e0a9bad0f6\": container with ID starting with 2644c3607f0576b1dc2cf9c609f794a4de541fac2b85b056baab58e0a9bad0f6 not found: ID does not exist" containerID="2644c3607f0576b1dc2cf9c609f794a4de541fac2b85b056baab58e0a9bad0f6" Mar 20 07:32:08 crc kubenswrapper[4749]: I0320 07:32:08.129770 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2644c3607f0576b1dc2cf9c609f794a4de541fac2b85b056baab58e0a9bad0f6"} err="failed to get container status \"2644c3607f0576b1dc2cf9c609f794a4de541fac2b85b056baab58e0a9bad0f6\": rpc error: code = NotFound desc = could not find container \"2644c3607f0576b1dc2cf9c609f794a4de541fac2b85b056baab58e0a9bad0f6\": container with ID starting with 2644c3607f0576b1dc2cf9c609f794a4de541fac2b85b056baab58e0a9bad0f6 not found: ID does not exist" Mar 20 07:32:08 crc kubenswrapper[4749]: I0320 07:32:08.208693 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc55663c-b6fa-419c-a9a3-2f4234b8f27d" path="/var/lib/kubelet/pods/fc55663c-b6fa-419c-a9a3-2f4234b8f27d/volumes" Mar 20 07:32:08 crc kubenswrapper[4749]: I0320 07:32:08.209967 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-b628q"] Mar 20 07:32:08 crc kubenswrapper[4749]: I0320 07:32:08.212561 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-b628q"] Mar 20 07:32:09 crc kubenswrapper[4749]: I0320 07:32:09.394668 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-6cllh" Mar 20 07:32:09 crc kubenswrapper[4749]: I0320 07:32:09.550827 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knvgt\" (UniqueName: \"kubernetes.io/projected/f48a0db5-3834-46dd-a959-a6a4e67fc1dd-kube-api-access-knvgt\") pod \"f48a0db5-3834-46dd-a959-a6a4e67fc1dd\" (UID: \"f48a0db5-3834-46dd-a959-a6a4e67fc1dd\") " Mar 20 07:32:09 crc kubenswrapper[4749]: I0320 07:32:09.551005 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f48a0db5-3834-46dd-a959-a6a4e67fc1dd-operator-scripts\") pod \"f48a0db5-3834-46dd-a959-a6a4e67fc1dd\" (UID: \"f48a0db5-3834-46dd-a959-a6a4e67fc1dd\") " Mar 20 07:32:09 crc kubenswrapper[4749]: I0320 07:32:09.552437 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f48a0db5-3834-46dd-a959-a6a4e67fc1dd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f48a0db5-3834-46dd-a959-a6a4e67fc1dd" (UID: "f48a0db5-3834-46dd-a959-a6a4e67fc1dd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:32:09 crc kubenswrapper[4749]: I0320 07:32:09.571089 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f48a0db5-3834-46dd-a959-a6a4e67fc1dd-kube-api-access-knvgt" (OuterVolumeSpecName: "kube-api-access-knvgt") pod "f48a0db5-3834-46dd-a959-a6a4e67fc1dd" (UID: "f48a0db5-3834-46dd-a959-a6a4e67fc1dd"). InnerVolumeSpecName "kube-api-access-knvgt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:32:09 crc kubenswrapper[4749]: I0320 07:32:09.653166 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f48a0db5-3834-46dd-a959-a6a4e67fc1dd-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:09 crc kubenswrapper[4749]: I0320 07:32:09.653208 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knvgt\" (UniqueName: \"kubernetes.io/projected/f48a0db5-3834-46dd-a959-a6a4e67fc1dd-kube-api-access-knvgt\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:09 crc kubenswrapper[4749]: I0320 07:32:09.675340 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-563c-account-create-update-xjv9j" Mar 20 07:32:09 crc kubenswrapper[4749]: I0320 07:32:09.681130 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b25f-account-create-update-4prvn" Mar 20 07:32:09 crc kubenswrapper[4749]: I0320 07:32:09.692426 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-94fb-account-create-update-qpb8t" Mar 20 07:32:09 crc kubenswrapper[4749]: I0320 07:32:09.700653 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-ds4q2" Mar 20 07:32:09 crc kubenswrapper[4749]: I0320 07:32:09.707342 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-vvwnz" Mar 20 07:32:09 crc kubenswrapper[4749]: I0320 07:32:09.855124 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ks6k8\" (UniqueName: \"kubernetes.io/projected/5dae8082-8d1a-448c-961f-bf0c58f0bd81-kube-api-access-ks6k8\") pod \"5dae8082-8d1a-448c-961f-bf0c58f0bd81\" (UID: \"5dae8082-8d1a-448c-961f-bf0c58f0bd81\") " Mar 20 07:32:09 crc kubenswrapper[4749]: I0320 07:32:09.855263 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f61da3ae-a72f-4d88-b8cc-38d0503649d8-operator-scripts\") pod \"f61da3ae-a72f-4d88-b8cc-38d0503649d8\" (UID: \"f61da3ae-a72f-4d88-b8cc-38d0503649d8\") " Mar 20 07:32:09 crc kubenswrapper[4749]: I0320 07:32:09.855373 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blzd7\" (UniqueName: \"kubernetes.io/projected/e9ddc4f6-6f60-489c-bedb-44a31de6894e-kube-api-access-blzd7\") pod \"e9ddc4f6-6f60-489c-bedb-44a31de6894e\" (UID: \"e9ddc4f6-6f60-489c-bedb-44a31de6894e\") " Mar 20 07:32:09 crc kubenswrapper[4749]: I0320 07:32:09.855469 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db14fc2d-a32e-49a6-8d9c-9c6fd3e447f5-operator-scripts\") pod \"db14fc2d-a32e-49a6-8d9c-9c6fd3e447f5\" (UID: \"db14fc2d-a32e-49a6-8d9c-9c6fd3e447f5\") " Mar 20 07:32:09 crc kubenswrapper[4749]: I0320 07:32:09.855505 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9ddc4f6-6f60-489c-bedb-44a31de6894e-operator-scripts\") pod \"e9ddc4f6-6f60-489c-bedb-44a31de6894e\" (UID: \"e9ddc4f6-6f60-489c-bedb-44a31de6894e\") " Mar 20 07:32:09 crc kubenswrapper[4749]: I0320 07:32:09.855559 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5dae8082-8d1a-448c-961f-bf0c58f0bd81-operator-scripts\") pod \"5dae8082-8d1a-448c-961f-bf0c58f0bd81\" (UID: \"5dae8082-8d1a-448c-961f-bf0c58f0bd81\") " Mar 20 07:32:09 crc kubenswrapper[4749]: I0320 07:32:09.855603 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fz5nz\" (UniqueName: \"kubernetes.io/projected/db14fc2d-a32e-49a6-8d9c-9c6fd3e447f5-kube-api-access-fz5nz\") pod \"db14fc2d-a32e-49a6-8d9c-9c6fd3e447f5\" (UID: \"db14fc2d-a32e-49a6-8d9c-9c6fd3e447f5\") " Mar 20 07:32:09 crc kubenswrapper[4749]: I0320 07:32:09.855666 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzkx2\" (UniqueName: \"kubernetes.io/projected/f61da3ae-a72f-4d88-b8cc-38d0503649d8-kube-api-access-tzkx2\") pod \"f61da3ae-a72f-4d88-b8cc-38d0503649d8\" (UID: \"f61da3ae-a72f-4d88-b8cc-38d0503649d8\") " Mar 20 07:32:09 crc kubenswrapper[4749]: I0320 07:32:09.855748 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7811a5b-1577-4ecc-b54f-949bc39b0289-operator-scripts\") pod \"d7811a5b-1577-4ecc-b54f-949bc39b0289\" (UID: \"d7811a5b-1577-4ecc-b54f-949bc39b0289\") " Mar 20 07:32:09 crc kubenswrapper[4749]: I0320 07:32:09.855841 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jtgj\" (UniqueName: \"kubernetes.io/projected/d7811a5b-1577-4ecc-b54f-949bc39b0289-kube-api-access-8jtgj\") pod \"d7811a5b-1577-4ecc-b54f-949bc39b0289\" (UID: \"d7811a5b-1577-4ecc-b54f-949bc39b0289\") " Mar 20 07:32:09 crc kubenswrapper[4749]: I0320 07:32:09.856087 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5dae8082-8d1a-448c-961f-bf0c58f0bd81-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5dae8082-8d1a-448c-961f-bf0c58f0bd81" (UID: "5dae8082-8d1a-448c-961f-bf0c58f0bd81"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:32:09 crc kubenswrapper[4749]: I0320 07:32:09.856189 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f61da3ae-a72f-4d88-b8cc-38d0503649d8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f61da3ae-a72f-4d88-b8cc-38d0503649d8" (UID: "f61da3ae-a72f-4d88-b8cc-38d0503649d8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:32:09 crc kubenswrapper[4749]: I0320 07:32:09.856448 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9ddc4f6-6f60-489c-bedb-44a31de6894e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e9ddc4f6-6f60-489c-bedb-44a31de6894e" (UID: "e9ddc4f6-6f60-489c-bedb-44a31de6894e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:32:09 crc kubenswrapper[4749]: I0320 07:32:09.856597 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f61da3ae-a72f-4d88-b8cc-38d0503649d8-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:09 crc kubenswrapper[4749]: I0320 07:32:09.856619 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9ddc4f6-6f60-489c-bedb-44a31de6894e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:09 crc kubenswrapper[4749]: I0320 07:32:09.856629 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5dae8082-8d1a-448c-961f-bf0c58f0bd81-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:09 crc kubenswrapper[4749]: I0320 07:32:09.856811 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7811a5b-1577-4ecc-b54f-949bc39b0289-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d7811a5b-1577-4ecc-b54f-949bc39b0289" (UID: "d7811a5b-1577-4ecc-b54f-949bc39b0289"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:32:09 crc kubenswrapper[4749]: I0320 07:32:09.856810 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db14fc2d-a32e-49a6-8d9c-9c6fd3e447f5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "db14fc2d-a32e-49a6-8d9c-9c6fd3e447f5" (UID: "db14fc2d-a32e-49a6-8d9c-9c6fd3e447f5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:32:09 crc kubenswrapper[4749]: I0320 07:32:09.859869 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9ddc4f6-6f60-489c-bedb-44a31de6894e-kube-api-access-blzd7" (OuterVolumeSpecName: "kube-api-access-blzd7") pod "e9ddc4f6-6f60-489c-bedb-44a31de6894e" (UID: "e9ddc4f6-6f60-489c-bedb-44a31de6894e"). InnerVolumeSpecName "kube-api-access-blzd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:32:09 crc kubenswrapper[4749]: I0320 07:32:09.860719 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db14fc2d-a32e-49a6-8d9c-9c6fd3e447f5-kube-api-access-fz5nz" (OuterVolumeSpecName: "kube-api-access-fz5nz") pod "db14fc2d-a32e-49a6-8d9c-9c6fd3e447f5" (UID: "db14fc2d-a32e-49a6-8d9c-9c6fd3e447f5"). InnerVolumeSpecName "kube-api-access-fz5nz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:32:09 crc kubenswrapper[4749]: I0320 07:32:09.860824 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dae8082-8d1a-448c-961f-bf0c58f0bd81-kube-api-access-ks6k8" (OuterVolumeSpecName: "kube-api-access-ks6k8") pod "5dae8082-8d1a-448c-961f-bf0c58f0bd81" (UID: "5dae8082-8d1a-448c-961f-bf0c58f0bd81"). InnerVolumeSpecName "kube-api-access-ks6k8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:32:09 crc kubenswrapper[4749]: I0320 07:32:09.860994 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f61da3ae-a72f-4d88-b8cc-38d0503649d8-kube-api-access-tzkx2" (OuterVolumeSpecName: "kube-api-access-tzkx2") pod "f61da3ae-a72f-4d88-b8cc-38d0503649d8" (UID: "f61da3ae-a72f-4d88-b8cc-38d0503649d8"). 
InnerVolumeSpecName "kube-api-access-tzkx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:32:09 crc kubenswrapper[4749]: I0320 07:32:09.861161 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7811a5b-1577-4ecc-b54f-949bc39b0289-kube-api-access-8jtgj" (OuterVolumeSpecName: "kube-api-access-8jtgj") pod "d7811a5b-1577-4ecc-b54f-949bc39b0289" (UID: "d7811a5b-1577-4ecc-b54f-949bc39b0289"). InnerVolumeSpecName "kube-api-access-8jtgj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:32:09 crc kubenswrapper[4749]: I0320 07:32:09.959616 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db14fc2d-a32e-49a6-8d9c-9c6fd3e447f5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:09 crc kubenswrapper[4749]: I0320 07:32:09.959679 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fz5nz\" (UniqueName: \"kubernetes.io/projected/db14fc2d-a32e-49a6-8d9c-9c6fd3e447f5-kube-api-access-fz5nz\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:09 crc kubenswrapper[4749]: I0320 07:32:09.959705 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzkx2\" (UniqueName: \"kubernetes.io/projected/f61da3ae-a72f-4d88-b8cc-38d0503649d8-kube-api-access-tzkx2\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:09 crc kubenswrapper[4749]: I0320 07:32:09.959778 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7811a5b-1577-4ecc-b54f-949bc39b0289-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:09 crc kubenswrapper[4749]: I0320 07:32:09.959810 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jtgj\" (UniqueName: \"kubernetes.io/projected/d7811a5b-1577-4ecc-b54f-949bc39b0289-kube-api-access-8jtgj\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:09 crc kubenswrapper[4749]: I0320 07:32:09.959834 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ks6k8\" (UniqueName: \"kubernetes.io/projected/5dae8082-8d1a-448c-961f-bf0c58f0bd81-kube-api-access-ks6k8\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:09 crc kubenswrapper[4749]: I0320 07:32:09.959857 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blzd7\" (UniqueName: \"kubernetes.io/projected/e9ddc4f6-6f60-489c-bedb-44a31de6894e-kube-api-access-blzd7\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:10 crc kubenswrapper[4749]: I0320 07:32:10.069424 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ds4q2" event={"ID":"d7811a5b-1577-4ecc-b54f-949bc39b0289","Type":"ContainerDied","Data":"252fbb69cf0eabb313b08a486ae2e0f5bf00a781a2ce62da13da71896ac24e13"} Mar 20 07:32:10 crc kubenswrapper[4749]: I0320 07:32:10.069466 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="252fbb69cf0eabb313b08a486ae2e0f5bf00a781a2ce62da13da71896ac24e13" Mar 20 07:32:10 crc kubenswrapper[4749]: I0320 07:32:10.069473 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-ds4q2" Mar 20 07:32:10 crc kubenswrapper[4749]: I0320 07:32:10.070846 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-vvwnz" event={"ID":"f61da3ae-a72f-4d88-b8cc-38d0503649d8","Type":"ContainerDied","Data":"159e3e318ced0639ac0dfce98f9808b4340ae72da4cc8286aa44eff592af931e"} Mar 20 07:32:10 crc kubenswrapper[4749]: I0320 07:32:10.070897 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="159e3e318ced0639ac0dfce98f9808b4340ae72da4cc8286aa44eff592af931e" Mar 20 07:32:10 crc kubenswrapper[4749]: I0320 07:32:10.070872 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-vvwnz" Mar 20 07:32:10 crc kubenswrapper[4749]: I0320 07:32:10.072574 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-563c-account-create-update-xjv9j" Mar 20 07:32:10 crc kubenswrapper[4749]: I0320 07:32:10.072567 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-563c-account-create-update-xjv9j" event={"ID":"e9ddc4f6-6f60-489c-bedb-44a31de6894e","Type":"ContainerDied","Data":"83d9ee994aee160850d6f331b089e199481605431feff27672b91adb5e753fd1"} Mar 20 07:32:10 crc kubenswrapper[4749]: I0320 07:32:10.072696 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83d9ee994aee160850d6f331b089e199481605431feff27672b91adb5e753fd1" Mar 20 07:32:10 crc kubenswrapper[4749]: I0320 07:32:10.074003 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-94fb-account-create-update-qpb8t" Mar 20 07:32:10 crc kubenswrapper[4749]: I0320 07:32:10.074023 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-94fb-account-create-update-qpb8t" event={"ID":"db14fc2d-a32e-49a6-8d9c-9c6fd3e447f5","Type":"ContainerDied","Data":"5bebc714906e2d3744774dafd3bf2208504019418e1d00b10b0aeaf1da72ee8f"} Mar 20 07:32:10 crc kubenswrapper[4749]: I0320 07:32:10.074041 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5bebc714906e2d3744774dafd3bf2208504019418e1d00b10b0aeaf1da72ee8f" Mar 20 07:32:10 crc kubenswrapper[4749]: I0320 07:32:10.075810 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-6cllh" Mar 20 07:32:10 crc kubenswrapper[4749]: I0320 07:32:10.075804 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6cllh" event={"ID":"f48a0db5-3834-46dd-a959-a6a4e67fc1dd","Type":"ContainerDied","Data":"b53bba30f6c46d8b88b60a4b5286388894120ed0ce4f84e26f2aeb4f609cf7d0"} Mar 20 07:32:10 crc kubenswrapper[4749]: I0320 07:32:10.075861 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b53bba30f6c46d8b88b60a4b5286388894120ed0ce4f84e26f2aeb4f609cf7d0" Mar 20 07:32:10 crc kubenswrapper[4749]: I0320 07:32:10.077364 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b25f-account-create-update-4prvn" event={"ID":"5dae8082-8d1a-448c-961f-bf0c58f0bd81","Type":"ContainerDied","Data":"aedc4733bbee3effd6b731f742b1d33e66c857a25c3acca78bd6cc6d6ee40d5c"} Mar 20 07:32:10 crc kubenswrapper[4749]: I0320 07:32:10.077383 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aedc4733bbee3effd6b731f742b1d33e66c857a25c3acca78bd6cc6d6ee40d5c" Mar 20 07:32:10 crc kubenswrapper[4749]: I0320 07:32:10.077427 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b25f-account-create-update-4prvn" Mar 20 07:32:10 crc kubenswrapper[4749]: I0320 07:32:10.191664 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc41dd13-79db-4f94-a11b-bc0cf369bb76" path="/var/lib/kubelet/pods/dc41dd13-79db-4f94-a11b-bc0cf369bb76/volumes" Mar 20 07:32:11 crc kubenswrapper[4749]: I0320 07:32:11.154123 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-gmxsx"] Mar 20 07:32:11 crc kubenswrapper[4749]: I0320 07:32:11.161889 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-gmxsx"] Mar 20 07:32:11 crc kubenswrapper[4749]: I0320 07:32:11.253066 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-fvvnb"] Mar 20 07:32:11 crc kubenswrapper[4749]: E0320 07:32:11.253473 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d08b93fb-b8b6-4ec4-a812-fa127e9519ae" containerName="oc" Mar 20 07:32:11 crc kubenswrapper[4749]: I0320 07:32:11.253500 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d08b93fb-b8b6-4ec4-a812-fa127e9519ae" containerName="oc" Mar 20 07:32:11 crc kubenswrapper[4749]: E0320 07:32:11.253525 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dae8082-8d1a-448c-961f-bf0c58f0bd81" containerName="mariadb-account-create-update" Mar 20 07:32:11 crc kubenswrapper[4749]: I0320 07:32:11.253535 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dae8082-8d1a-448c-961f-bf0c58f0bd81" containerName="mariadb-account-create-update" Mar 20 07:32:11 crc kubenswrapper[4749]: E0320 07:32:11.253554 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9ddc4f6-6f60-489c-bedb-44a31de6894e" containerName="mariadb-account-create-update" Mar 20 07:32:11 crc kubenswrapper[4749]: I0320 07:32:11.253564 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9ddc4f6-6f60-489c-bedb-44a31de6894e" containerName="mariadb-account-create-update" Mar 20 07:32:11 crc kubenswrapper[4749]: E0320 07:32:11.253577 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f61da3ae-a72f-4d88-b8cc-38d0503649d8" containerName="mariadb-database-create" Mar 20 07:32:11 crc 
kubenswrapper[4749]: I0320 07:32:11.253588 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f61da3ae-a72f-4d88-b8cc-38d0503649d8" containerName="mariadb-database-create" Mar 20 07:32:11 crc kubenswrapper[4749]: E0320 07:32:11.253614 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc41dd13-79db-4f94-a11b-bc0cf369bb76" containerName="init" Mar 20 07:32:11 crc kubenswrapper[4749]: I0320 07:32:11.253622 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc41dd13-79db-4f94-a11b-bc0cf369bb76" containerName="init" Mar 20 07:32:11 crc kubenswrapper[4749]: E0320 07:32:11.253636 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc41dd13-79db-4f94-a11b-bc0cf369bb76" containerName="dnsmasq-dns" Mar 20 07:32:11 crc kubenswrapper[4749]: I0320 07:32:11.253644 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc41dd13-79db-4f94-a11b-bc0cf369bb76" containerName="dnsmasq-dns" Mar 20 07:32:11 crc kubenswrapper[4749]: E0320 07:32:11.253656 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7811a5b-1577-4ecc-b54f-949bc39b0289" containerName="mariadb-database-create" Mar 20 07:32:11 crc kubenswrapper[4749]: I0320 07:32:11.253664 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7811a5b-1577-4ecc-b54f-949bc39b0289" containerName="mariadb-database-create" Mar 20 07:32:11 crc kubenswrapper[4749]: E0320 07:32:11.253674 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="311c3c9f-da96-491a-a2e1-481b567af283" containerName="mariadb-account-create-update" Mar 20 07:32:11 crc kubenswrapper[4749]: I0320 07:32:11.253681 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="311c3c9f-da96-491a-a2e1-481b567af283" containerName="mariadb-account-create-update" Mar 20 07:32:11 crc kubenswrapper[4749]: E0320 07:32:11.253690 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db14fc2d-a32e-49a6-8d9c-9c6fd3e447f5" containerName="mariadb-account-create-update" Mar 20 07:32:11 crc kubenswrapper[4749]: I0320 07:32:11.253698 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="db14fc2d-a32e-49a6-8d9c-9c6fd3e447f5" containerName="mariadb-account-create-update" Mar 20 07:32:11 crc kubenswrapper[4749]: E0320 07:32:11.253713 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f48a0db5-3834-46dd-a959-a6a4e67fc1dd" containerName="mariadb-database-create" Mar 20 07:32:11 crc kubenswrapper[4749]: I0320 07:32:11.253721 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f48a0db5-3834-46dd-a959-a6a4e67fc1dd" containerName="mariadb-database-create" Mar 20 07:32:11 crc kubenswrapper[4749]: I0320 07:32:11.253918 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f61da3ae-a72f-4d88-b8cc-38d0503649d8" containerName="mariadb-database-create" Mar 20 07:32:11 crc kubenswrapper[4749]: I0320 07:32:11.253938 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f48a0db5-3834-46dd-a959-a6a4e67fc1dd" containerName="mariadb-database-create" Mar 20 07:32:11 crc kubenswrapper[4749]: I0320 07:32:11.253953 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="311c3c9f-da96-491a-a2e1-481b567af283" containerName="mariadb-account-create-update" Mar 20 07:32:11 crc kubenswrapper[4749]: I0320 07:32:11.253970 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d08b93fb-b8b6-4ec4-a812-fa127e9519ae" containerName="oc" Mar 20 07:32:11 crc kubenswrapper[4749]: I0320 07:32:11.253981 4749 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="5dae8082-8d1a-448c-961f-bf0c58f0bd81" containerName="mariadb-account-create-update" Mar 20 07:32:11 crc kubenswrapper[4749]: I0320 07:32:11.253994 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc41dd13-79db-4f94-a11b-bc0cf369bb76" containerName="dnsmasq-dns" Mar 20 07:32:11 crc kubenswrapper[4749]: I0320 07:32:11.254004 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9ddc4f6-6f60-489c-bedb-44a31de6894e" containerName="mariadb-account-create-update" Mar 20 07:32:11 crc kubenswrapper[4749]: I0320 07:32:11.254014 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7811a5b-1577-4ecc-b54f-949bc39b0289" containerName="mariadb-database-create" Mar 20 07:32:11 crc kubenswrapper[4749]: I0320 07:32:11.254024 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="db14fc2d-a32e-49a6-8d9c-9c6fd3e447f5" containerName="mariadb-account-create-update" Mar 20 07:32:11 crc kubenswrapper[4749]: I0320 07:32:11.254712 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-fvvnb" Mar 20 07:32:11 crc kubenswrapper[4749]: I0320 07:32:11.257067 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 20 07:32:11 crc kubenswrapper[4749]: I0320 07:32:11.266252 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-fvvnb"] Mar 20 07:32:11 crc kubenswrapper[4749]: I0320 07:32:11.382801 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d45l\" (UniqueName: \"kubernetes.io/projected/1cccea11-5b61-437f-bddb-888f138a1d3f-kube-api-access-9d45l\") pod \"root-account-create-update-fvvnb\" (UID: \"1cccea11-5b61-437f-bddb-888f138a1d3f\") " pod="openstack/root-account-create-update-fvvnb" Mar 20 07:32:11 crc kubenswrapper[4749]: I0320 07:32:11.383198 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cccea11-5b61-437f-bddb-888f138a1d3f-operator-scripts\") pod \"root-account-create-update-fvvnb\" (UID: \"1cccea11-5b61-437f-bddb-888f138a1d3f\") " pod="openstack/root-account-create-update-fvvnb" Mar 20 07:32:11 crc kubenswrapper[4749]: I0320 07:32:11.484651 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9d45l\" (UniqueName: \"kubernetes.io/projected/1cccea11-5b61-437f-bddb-888f138a1d3f-kube-api-access-9d45l\") pod \"root-account-create-update-fvvnb\" (UID: \"1cccea11-5b61-437f-bddb-888f138a1d3f\") " pod="openstack/root-account-create-update-fvvnb" Mar 20 07:32:11 crc kubenswrapper[4749]: I0320 07:32:11.484952 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cccea11-5b61-437f-bddb-888f138a1d3f-operator-scripts\") pod \"root-account-create-update-fvvnb\" (UID: \"1cccea11-5b61-437f-bddb-888f138a1d3f\") " pod="openstack/root-account-create-update-fvvnb" Mar 20 07:32:11 crc kubenswrapper[4749]: I0320 07:32:11.486243 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cccea11-5b61-437f-bddb-888f138a1d3f-operator-scripts\") pod \"root-account-create-update-fvvnb\" (UID: \"1cccea11-5b61-437f-bddb-888f138a1d3f\") " 
pod="openstack/root-account-create-update-fvvnb" Mar 20 07:32:11 crc kubenswrapper[4749]: I0320 07:32:11.523533 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9d45l\" (UniqueName: \"kubernetes.io/projected/1cccea11-5b61-437f-bddb-888f138a1d3f-kube-api-access-9d45l\") pod \"root-account-create-update-fvvnb\" (UID: \"1cccea11-5b61-437f-bddb-888f138a1d3f\") " pod="openstack/root-account-create-update-fvvnb" Mar 20 07:32:11 crc kubenswrapper[4749]: I0320 07:32:11.570232 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-fvvnb" Mar 20 07:32:12 crc kubenswrapper[4749]: I0320 07:32:12.035589 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-fvvnb"] Mar 20 07:32:12 crc kubenswrapper[4749]: I0320 07:32:12.103039 4749 generic.go:334] "Generic (PLEG): container finished" podID="8b9b402f-2d95-48f5-98d8-497d90956ba2" containerID="8572c6a9460b80002b347994673a59cd6df57ba39c3cf1dc1f924436191cb2c3" exitCode=0 Mar 20 07:32:12 crc kubenswrapper[4749]: I0320 07:32:12.103106 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8b9b402f-2d95-48f5-98d8-497d90956ba2","Type":"ContainerDied","Data":"8572c6a9460b80002b347994673a59cd6df57ba39c3cf1dc1f924436191cb2c3"} Mar 20 07:32:12 crc kubenswrapper[4749]: I0320 07:32:12.107157 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-fvvnb" event={"ID":"1cccea11-5b61-437f-bddb-888f138a1d3f","Type":"ContainerStarted","Data":"e880527666098b00ebe45390b9e6e5d9dc1d6ed3adf6bfcd4511c4572c4dadaf"} Mar 20 07:32:12 crc kubenswrapper[4749]: I0320 07:32:12.196192 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="311c3c9f-da96-491a-a2e1-481b567af283" path="/var/lib/kubelet/pods/311c3c9f-da96-491a-a2e1-481b567af283/volumes" Mar 20 07:32:13 crc kubenswrapper[4749]: I0320 07:32:13.120915 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8b9b402f-2d95-48f5-98d8-497d90956ba2","Type":"ContainerStarted","Data":"550cc6e3eedc7eeebc5abed9e9349810e24e9b6751499624000a1720500e207b"} Mar 20 07:32:13 crc kubenswrapper[4749]: I0320 07:32:13.121694 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:32:13 crc kubenswrapper[4749]: I0320 07:32:13.124277 4749 generic.go:334] "Generic (PLEG): container finished" podID="1cccea11-5b61-437f-bddb-888f138a1d3f" containerID="d717c0878684addb787c2829681853ed2653c938c3158afe5b2647421d5e6044" exitCode=0 Mar 20 07:32:13 crc kubenswrapper[4749]: I0320 07:32:13.124345 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-fvvnb" event={"ID":"1cccea11-5b61-437f-bddb-888f138a1d3f","Type":"ContainerDied","Data":"d717c0878684addb787c2829681853ed2653c938c3158afe5b2647421d5e6044"} Mar 20 07:32:13 crc kubenswrapper[4749]: I0320 07:32:13.163580 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=42.235043805 podStartE2EDuration="54.163556095s" podCreationTimestamp="2026-03-20 07:31:19 +0000 UTC" firstStartedPulling="2026-03-20 07:31:25.111054448 +0000 UTC m=+1121.660712115" lastFinishedPulling="2026-03-20 07:31:37.039566748 +0000 UTC m=+1133.589224405" observedRunningTime="2026-03-20 07:32:13.153715275 +0000 UTC m=+1169.703372942" 
Mar 20 07:32:13 crc kubenswrapper[4749]: I0320 07:32:13.622932 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8272956e-b31a-4bd8-9118-3ca9721e6d75-etc-swift\") pod \"swift-storage-0\" (UID: \"8272956e-b31a-4bd8-9118-3ca9721e6d75\") " pod="openstack/swift-storage-0"
Mar 20 07:32:13 crc kubenswrapper[4749]: E0320 07:32:13.623224 4749 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 20 07:32:13 crc kubenswrapper[4749]: E0320 07:32:13.623261 4749 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 20 07:32:13 crc kubenswrapper[4749]: E0320 07:32:13.623382 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8272956e-b31a-4bd8-9118-3ca9721e6d75-etc-swift podName:8272956e-b31a-4bd8-9118-3ca9721e6d75 nodeName:}" failed. No retries permitted until 2026-03-20 07:32:29.623352286 +0000 UTC m=+1186.173009973 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8272956e-b31a-4bd8-9118-3ca9721e6d75-etc-swift") pod "swift-storage-0" (UID: "8272956e-b31a-4bd8-9118-3ca9721e6d75") : configmap "swift-ring-files" not found
Mar 20 07:32:14 crc kubenswrapper[4749]: I0320 07:32:14.139670 4749 generic.go:334] "Generic (PLEG): container finished" podID="3adcfcfa-0ea4-4c5e-9e57-957538c1469e" containerID="d887667291809460ec9e6fd870e34f27450b46787cbe5148eaf7cc5ab07c1296" exitCode=0
Mar 20 07:32:14 crc kubenswrapper[4749]: I0320 07:32:14.140246 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-dgbxq" event={"ID":"3adcfcfa-0ea4-4c5e-9e57-957538c1469e","Type":"ContainerDied","Data":"d887667291809460ec9e6fd870e34f27450b46787cbe5148eaf7cc5ab07c1296"}
Mar 20 07:32:14 crc kubenswrapper[4749]: I0320 07:32:14.618579 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-fvvnb"
Mar 20 07:32:14 crc kubenswrapper[4749]: I0320 07:32:14.622073 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-qm6tz"]
Mar 20 07:32:14 crc kubenswrapper[4749]: E0320 07:32:14.622689 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cccea11-5b61-437f-bddb-888f138a1d3f" containerName="mariadb-account-create-update"
Mar 20 07:32:14 crc kubenswrapper[4749]: I0320 07:32:14.622727 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cccea11-5b61-437f-bddb-888f138a1d3f" containerName="mariadb-account-create-update"
Mar 20 07:32:14 crc kubenswrapper[4749]: I0320 07:32:14.623132 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cccea11-5b61-437f-bddb-888f138a1d3f" containerName="mariadb-account-create-update"
Mar 20 07:32:14 crc kubenswrapper[4749]: I0320 07:32:14.624438 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-qm6tz"
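The nestedpendingoperations.go line above shows kubelet's per-volume exponential backoff: the etc-swift projected volume for swift-storage-0 cannot be built until the swift-ring-files ConfigMap exists, so each failed MountVolume.SetUp doubles the wait, here reaching durationBeforeRetry 16s with the next attempt scheduled for 07:32:29. Since the swift-ring-rebalance job that produces the ring files finished with exit code 0 just above, that retry will likely succeed. A sketch of the doubling policy; the base and cap here are illustrative assumptions, not kubelet's exact constants (with a hypothetical 2s base, the fourth consecutive failure waits 16s):

// mount_backoff.go - toy model of the doubling retry delay behind
// "durationBeforeRetry 16s"; base and cap are illustrative assumptions.
package main

import (
	"fmt"
	"time"
)

func main() {
	base, maxDelay := 2*time.Second, 2*time.Minute
	d := base
	for attempt := 1; attempt <= 8; attempt++ {
		fmt.Printf("failure %d: next retry in %v\n", attempt, d)
		d *= 2
		if d > maxDelay {
			d = maxDelay
		}
	}
}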
Need to start a new one" pod="openstack/glance-db-sync-qm6tz" Mar 20 07:32:14 crc kubenswrapper[4749]: I0320 07:32:14.627509 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 20 07:32:14 crc kubenswrapper[4749]: I0320 07:32:14.628417 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-hpx5h" Mar 20 07:32:14 crc kubenswrapper[4749]: I0320 07:32:14.630497 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-qm6tz"] Mar 20 07:32:14 crc kubenswrapper[4749]: I0320 07:32:14.747086 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cccea11-5b61-437f-bddb-888f138a1d3f-operator-scripts\") pod \"1cccea11-5b61-437f-bddb-888f138a1d3f\" (UID: \"1cccea11-5b61-437f-bddb-888f138a1d3f\") " Mar 20 07:32:14 crc kubenswrapper[4749]: I0320 07:32:14.747137 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9d45l\" (UniqueName: \"kubernetes.io/projected/1cccea11-5b61-437f-bddb-888f138a1d3f-kube-api-access-9d45l\") pod \"1cccea11-5b61-437f-bddb-888f138a1d3f\" (UID: \"1cccea11-5b61-437f-bddb-888f138a1d3f\") " Mar 20 07:32:14 crc kubenswrapper[4749]: I0320 07:32:14.747476 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ae472d6-49bf-44a4-85a3-30e1dd169d3a-combined-ca-bundle\") pod \"glance-db-sync-qm6tz\" (UID: \"7ae472d6-49bf-44a4-85a3-30e1dd169d3a\") " pod="openstack/glance-db-sync-qm6tz" Mar 20 07:32:14 crc kubenswrapper[4749]: I0320 07:32:14.747504 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbvlw\" (UniqueName: \"kubernetes.io/projected/7ae472d6-49bf-44a4-85a3-30e1dd169d3a-kube-api-access-fbvlw\") pod \"glance-db-sync-qm6tz\" (UID: \"7ae472d6-49bf-44a4-85a3-30e1dd169d3a\") " pod="openstack/glance-db-sync-qm6tz" Mar 20 07:32:14 crc kubenswrapper[4749]: I0320 07:32:14.747548 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ae472d6-49bf-44a4-85a3-30e1dd169d3a-config-data\") pod \"glance-db-sync-qm6tz\" (UID: \"7ae472d6-49bf-44a4-85a3-30e1dd169d3a\") " pod="openstack/glance-db-sync-qm6tz" Mar 20 07:32:14 crc kubenswrapper[4749]: I0320 07:32:14.747695 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cccea11-5b61-437f-bddb-888f138a1d3f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1cccea11-5b61-437f-bddb-888f138a1d3f" (UID: "1cccea11-5b61-437f-bddb-888f138a1d3f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:32:14 crc kubenswrapper[4749]: I0320 07:32:14.747791 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7ae472d6-49bf-44a4-85a3-30e1dd169d3a-db-sync-config-data\") pod \"glance-db-sync-qm6tz\" (UID: \"7ae472d6-49bf-44a4-85a3-30e1dd169d3a\") " pod="openstack/glance-db-sync-qm6tz" Mar 20 07:32:14 crc kubenswrapper[4749]: I0320 07:32:14.747987 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cccea11-5b61-437f-bddb-888f138a1d3f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:14 crc kubenswrapper[4749]: I0320 07:32:14.755061 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cccea11-5b61-437f-bddb-888f138a1d3f-kube-api-access-9d45l" (OuterVolumeSpecName: "kube-api-access-9d45l") pod "1cccea11-5b61-437f-bddb-888f138a1d3f" (UID: "1cccea11-5b61-437f-bddb-888f138a1d3f"). InnerVolumeSpecName "kube-api-access-9d45l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:32:14 crc kubenswrapper[4749]: I0320 07:32:14.849378 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7ae472d6-49bf-44a4-85a3-30e1dd169d3a-db-sync-config-data\") pod \"glance-db-sync-qm6tz\" (UID: \"7ae472d6-49bf-44a4-85a3-30e1dd169d3a\") " pod="openstack/glance-db-sync-qm6tz" Mar 20 07:32:14 crc kubenswrapper[4749]: I0320 07:32:14.849581 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ae472d6-49bf-44a4-85a3-30e1dd169d3a-combined-ca-bundle\") pod \"glance-db-sync-qm6tz\" (UID: \"7ae472d6-49bf-44a4-85a3-30e1dd169d3a\") " pod="openstack/glance-db-sync-qm6tz" Mar 20 07:32:14 crc kubenswrapper[4749]: I0320 07:32:14.849618 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbvlw\" (UniqueName: \"kubernetes.io/projected/7ae472d6-49bf-44a4-85a3-30e1dd169d3a-kube-api-access-fbvlw\") pod \"glance-db-sync-qm6tz\" (UID: \"7ae472d6-49bf-44a4-85a3-30e1dd169d3a\") " pod="openstack/glance-db-sync-qm6tz" Mar 20 07:32:14 crc kubenswrapper[4749]: I0320 07:32:14.849712 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ae472d6-49bf-44a4-85a3-30e1dd169d3a-config-data\") pod \"glance-db-sync-qm6tz\" (UID: \"7ae472d6-49bf-44a4-85a3-30e1dd169d3a\") " pod="openstack/glance-db-sync-qm6tz" Mar 20 07:32:14 crc kubenswrapper[4749]: I0320 07:32:14.849826 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9d45l\" (UniqueName: \"kubernetes.io/projected/1cccea11-5b61-437f-bddb-888f138a1d3f-kube-api-access-9d45l\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:14 crc kubenswrapper[4749]: I0320 07:32:14.854414 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7ae472d6-49bf-44a4-85a3-30e1dd169d3a-db-sync-config-data\") pod \"glance-db-sync-qm6tz\" (UID: \"7ae472d6-49bf-44a4-85a3-30e1dd169d3a\") " pod="openstack/glance-db-sync-qm6tz" Mar 20 07:32:14 crc kubenswrapper[4749]: I0320 07:32:14.856545 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7ae472d6-49bf-44a4-85a3-30e1dd169d3a-config-data\") pod \"glance-db-sync-qm6tz\" (UID: \"7ae472d6-49bf-44a4-85a3-30e1dd169d3a\") " pod="openstack/glance-db-sync-qm6tz" Mar 20 07:32:14 crc kubenswrapper[4749]: I0320 07:32:14.856995 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ae472d6-49bf-44a4-85a3-30e1dd169d3a-combined-ca-bundle\") pod \"glance-db-sync-qm6tz\" (UID: \"7ae472d6-49bf-44a4-85a3-30e1dd169d3a\") " pod="openstack/glance-db-sync-qm6tz" Mar 20 07:32:14 crc kubenswrapper[4749]: I0320 07:32:14.869307 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbvlw\" (UniqueName: \"kubernetes.io/projected/7ae472d6-49bf-44a4-85a3-30e1dd169d3a-kube-api-access-fbvlw\") pod \"glance-db-sync-qm6tz\" (UID: \"7ae472d6-49bf-44a4-85a3-30e1dd169d3a\") " pod="openstack/glance-db-sync-qm6tz" Mar 20 07:32:14 crc kubenswrapper[4749]: I0320 07:32:14.945403 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-qm6tz" Mar 20 07:32:15 crc kubenswrapper[4749]: I0320 07:32:15.149098 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-fvvnb" Mar 20 07:32:15 crc kubenswrapper[4749]: I0320 07:32:15.152783 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-fvvnb" event={"ID":"1cccea11-5b61-437f-bddb-888f138a1d3f","Type":"ContainerDied","Data":"e880527666098b00ebe45390b9e6e5d9dc1d6ed3adf6bfcd4511c4572c4dadaf"} Mar 20 07:32:15 crc kubenswrapper[4749]: I0320 07:32:15.152844 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e880527666098b00ebe45390b9e6e5d9dc1d6ed3adf6bfcd4511c4572c4dadaf" Mar 20 07:32:15 crc kubenswrapper[4749]: I0320 07:32:15.435750 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-dgbxq" Mar 20 07:32:15 crc kubenswrapper[4749]: I0320 07:32:15.460609 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5g454\" (UniqueName: \"kubernetes.io/projected/3adcfcfa-0ea4-4c5e-9e57-957538c1469e-kube-api-access-5g454\") pod \"3adcfcfa-0ea4-4c5e-9e57-957538c1469e\" (UID: \"3adcfcfa-0ea4-4c5e-9e57-957538c1469e\") " Mar 20 07:32:15 crc kubenswrapper[4749]: I0320 07:32:15.460667 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3adcfcfa-0ea4-4c5e-9e57-957538c1469e-ring-data-devices\") pod \"3adcfcfa-0ea4-4c5e-9e57-957538c1469e\" (UID: \"3adcfcfa-0ea4-4c5e-9e57-957538c1469e\") " Mar 20 07:32:15 crc kubenswrapper[4749]: I0320 07:32:15.460737 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3adcfcfa-0ea4-4c5e-9e57-957538c1469e-dispersionconf\") pod \"3adcfcfa-0ea4-4c5e-9e57-957538c1469e\" (UID: \"3adcfcfa-0ea4-4c5e-9e57-957538c1469e\") " Mar 20 07:32:15 crc kubenswrapper[4749]: I0320 07:32:15.460796 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3adcfcfa-0ea4-4c5e-9e57-957538c1469e-combined-ca-bundle\") pod \"3adcfcfa-0ea4-4c5e-9e57-957538c1469e\" (UID: \"3adcfcfa-0ea4-4c5e-9e57-957538c1469e\") " Mar 20 07:32:15 crc kubenswrapper[4749]: I0320 07:32:15.460849 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3adcfcfa-0ea4-4c5e-9e57-957538c1469e-scripts\") pod \"3adcfcfa-0ea4-4c5e-9e57-957538c1469e\" (UID: \"3adcfcfa-0ea4-4c5e-9e57-957538c1469e\") " Mar 20 07:32:15 crc kubenswrapper[4749]: I0320 07:32:15.460929 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3adcfcfa-0ea4-4c5e-9e57-957538c1469e-etc-swift\") pod \"3adcfcfa-0ea4-4c5e-9e57-957538c1469e\" (UID: \"3adcfcfa-0ea4-4c5e-9e57-957538c1469e\") " Mar 20 07:32:15 crc kubenswrapper[4749]: I0320 07:32:15.460980 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3adcfcfa-0ea4-4c5e-9e57-957538c1469e-swiftconf\") pod \"3adcfcfa-0ea4-4c5e-9e57-957538c1469e\" (UID: \"3adcfcfa-0ea4-4c5e-9e57-957538c1469e\") " Mar 20 07:32:15 crc kubenswrapper[4749]: I0320 07:32:15.463011 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3adcfcfa-0ea4-4c5e-9e57-957538c1469e-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "3adcfcfa-0ea4-4c5e-9e57-957538c1469e" (UID: "3adcfcfa-0ea4-4c5e-9e57-957538c1469e"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:32:15 crc kubenswrapper[4749]: I0320 07:32:15.464509 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3adcfcfa-0ea4-4c5e-9e57-957538c1469e-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "3adcfcfa-0ea4-4c5e-9e57-957538c1469e" (UID: "3adcfcfa-0ea4-4c5e-9e57-957538c1469e"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:32:15 crc kubenswrapper[4749]: I0320 07:32:15.466540 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3adcfcfa-0ea4-4c5e-9e57-957538c1469e-kube-api-access-5g454" (OuterVolumeSpecName: "kube-api-access-5g454") pod "3adcfcfa-0ea4-4c5e-9e57-957538c1469e" (UID: "3adcfcfa-0ea4-4c5e-9e57-957538c1469e"). InnerVolumeSpecName "kube-api-access-5g454". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:32:15 crc kubenswrapper[4749]: I0320 07:32:15.468581 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3adcfcfa-0ea4-4c5e-9e57-957538c1469e-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "3adcfcfa-0ea4-4c5e-9e57-957538c1469e" (UID: "3adcfcfa-0ea4-4c5e-9e57-957538c1469e"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:32:15 crc kubenswrapper[4749]: I0320 07:32:15.492438 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3adcfcfa-0ea4-4c5e-9e57-957538c1469e-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "3adcfcfa-0ea4-4c5e-9e57-957538c1469e" (UID: "3adcfcfa-0ea4-4c5e-9e57-957538c1469e"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:32:15 crc kubenswrapper[4749]: I0320 07:32:15.502339 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3adcfcfa-0ea4-4c5e-9e57-957538c1469e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3adcfcfa-0ea4-4c5e-9e57-957538c1469e" (UID: "3adcfcfa-0ea4-4c5e-9e57-957538c1469e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:32:15 crc kubenswrapper[4749]: I0320 07:32:15.504349 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3adcfcfa-0ea4-4c5e-9e57-957538c1469e-scripts" (OuterVolumeSpecName: "scripts") pod "3adcfcfa-0ea4-4c5e-9e57-957538c1469e" (UID: "3adcfcfa-0ea4-4c5e-9e57-957538c1469e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:32:15 crc kubenswrapper[4749]: W0320 07:32:15.509923 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ae472d6_49bf_44a4_85a3_30e1dd169d3a.slice/crio-bbe8324f1293a150413b3046bba42bab693dce7514988586e6f5b40eeae33630 WatchSource:0}: Error finding container bbe8324f1293a150413b3046bba42bab693dce7514988586e6f5b40eeae33630: Status 404 returned error can't find the container with id bbe8324f1293a150413b3046bba42bab693dce7514988586e6f5b40eeae33630 Mar 20 07:32:15 crc kubenswrapper[4749]: I0320 07:32:15.512435 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 07:32:15 crc kubenswrapper[4749]: I0320 07:32:15.521734 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-qm6tz"] Mar 20 07:32:15 crc kubenswrapper[4749]: I0320 07:32:15.563206 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3adcfcfa-0ea4-4c5e-9e57-957538c1469e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:15 crc kubenswrapper[4749]: I0320 07:32:15.563242 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3adcfcfa-0ea4-4c5e-9e57-957538c1469e-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:15 crc kubenswrapper[4749]: I0320 07:32:15.563250 4749 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3adcfcfa-0ea4-4c5e-9e57-957538c1469e-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:15 crc kubenswrapper[4749]: I0320 07:32:15.563259 4749 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3adcfcfa-0ea4-4c5e-9e57-957538c1469e-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:15 crc kubenswrapper[4749]: I0320 07:32:15.563268 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5g454\" (UniqueName: \"kubernetes.io/projected/3adcfcfa-0ea4-4c5e-9e57-957538c1469e-kube-api-access-5g454\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:15 crc kubenswrapper[4749]: I0320 07:32:15.563291 4749 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3adcfcfa-0ea4-4c5e-9e57-957538c1469e-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:15 crc kubenswrapper[4749]: I0320 07:32:15.563302 4749 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3adcfcfa-0ea4-4c5e-9e57-957538c1469e-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:15 crc kubenswrapper[4749]: I0320 07:32:15.772666 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 20 07:32:16 crc kubenswrapper[4749]: I0320 07:32:16.163039 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-qm6tz" event={"ID":"7ae472d6-49bf-44a4-85a3-30e1dd169d3a","Type":"ContainerStarted","Data":"bbe8324f1293a150413b3046bba42bab693dce7514988586e6f5b40eeae33630"} Mar 20 07:32:16 crc kubenswrapper[4749]: I0320 07:32:16.165693 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-dgbxq" 
event={"ID":"3adcfcfa-0ea4-4c5e-9e57-957538c1469e","Type":"ContainerDied","Data":"0e90ef62cbc5c8075e53d48cf7cb1c8b5d794d84f6257afbd684759e0231829c"} Mar 20 07:32:16 crc kubenswrapper[4749]: I0320 07:32:16.165719 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e90ef62cbc5c8075e53d48cf7cb1c8b5d794d84f6257afbd684759e0231829c" Mar 20 07:32:16 crc kubenswrapper[4749]: I0320 07:32:16.165730 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-dgbxq" Mar 20 07:32:16 crc kubenswrapper[4749]: I0320 07:32:16.168101 4749 generic.go:334] "Generic (PLEG): container finished" podID="8b9b402f-2d95-48f5-98d8-497d90956ba2" containerID="550cc6e3eedc7eeebc5abed9e9349810e24e9b6751499624000a1720500e207b" exitCode=0 Mar 20 07:32:16 crc kubenswrapper[4749]: I0320 07:32:16.168163 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8b9b402f-2d95-48f5-98d8-497d90956ba2","Type":"ContainerDied","Data":"550cc6e3eedc7eeebc5abed9e9349810e24e9b6751499624000a1720500e207b"} Mar 20 07:32:16 crc kubenswrapper[4749]: I0320 07:32:16.169159 4749 scope.go:117] "RemoveContainer" containerID="550cc6e3eedc7eeebc5abed9e9349810e24e9b6751499624000a1720500e207b" Mar 20 07:32:17 crc kubenswrapper[4749]: I0320 07:32:17.180987 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8b9b402f-2d95-48f5-98d8-497d90956ba2","Type":"ContainerStarted","Data":"0763b178530122b3e8e381a52b72b60d0103b38188e175f70597899afb88e2da"} Mar 20 07:32:17 crc kubenswrapper[4749]: I0320 07:32:17.181253 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:32:19 crc kubenswrapper[4749]: I0320 07:32:19.682700 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-tx9bw" podUID="32ceaa95-18d9-4f1e-9ebd-f2d413709413" containerName="ovn-controller" probeResult="failure" output=< Mar 20 07:32:19 crc kubenswrapper[4749]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 20 07:32:19 crc kubenswrapper[4749]: > Mar 20 07:32:19 crc kubenswrapper[4749]: I0320 07:32:19.721638 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-kvqdd" Mar 20 07:32:19 crc kubenswrapper[4749]: I0320 07:32:19.725392 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-kvqdd" Mar 20 07:32:19 crc kubenswrapper[4749]: I0320 07:32:19.958127 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-tx9bw-config-cbhvp"] Mar 20 07:32:19 crc kubenswrapper[4749]: E0320 07:32:19.958627 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3adcfcfa-0ea4-4c5e-9e57-957538c1469e" containerName="swift-ring-rebalance" Mar 20 07:32:19 crc kubenswrapper[4749]: I0320 07:32:19.958651 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3adcfcfa-0ea4-4c5e-9e57-957538c1469e" containerName="swift-ring-rebalance" Mar 20 07:32:19 crc kubenswrapper[4749]: I0320 07:32:19.958907 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3adcfcfa-0ea4-4c5e-9e57-957538c1469e" containerName="swift-ring-rebalance" Mar 20 07:32:19 crc kubenswrapper[4749]: I0320 07:32:19.960975 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-tx9bw-config-cbhvp" Mar 20 07:32:19 crc kubenswrapper[4749]: I0320 07:32:19.964562 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 20 07:32:19 crc kubenswrapper[4749]: I0320 07:32:19.970919 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-tx9bw-config-cbhvp"] Mar 20 07:32:20 crc kubenswrapper[4749]: I0320 07:32:20.066943 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/83e26e83-f14d-4346-a537-332070445891-var-log-ovn\") pod \"ovn-controller-tx9bw-config-cbhvp\" (UID: \"83e26e83-f14d-4346-a537-332070445891\") " pod="openstack/ovn-controller-tx9bw-config-cbhvp" Mar 20 07:32:20 crc kubenswrapper[4749]: I0320 07:32:20.067034 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/83e26e83-f14d-4346-a537-332070445891-scripts\") pod \"ovn-controller-tx9bw-config-cbhvp\" (UID: \"83e26e83-f14d-4346-a537-332070445891\") " pod="openstack/ovn-controller-tx9bw-config-cbhvp" Mar 20 07:32:20 crc kubenswrapper[4749]: I0320 07:32:20.067094 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/83e26e83-f14d-4346-a537-332070445891-var-run\") pod \"ovn-controller-tx9bw-config-cbhvp\" (UID: \"83e26e83-f14d-4346-a537-332070445891\") " pod="openstack/ovn-controller-tx9bw-config-cbhvp" Mar 20 07:32:20 crc kubenswrapper[4749]: I0320 07:32:20.067154 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9b97\" (UniqueName: \"kubernetes.io/projected/83e26e83-f14d-4346-a537-332070445891-kube-api-access-s9b97\") pod \"ovn-controller-tx9bw-config-cbhvp\" (UID: \"83e26e83-f14d-4346-a537-332070445891\") " pod="openstack/ovn-controller-tx9bw-config-cbhvp" Mar 20 07:32:20 crc kubenswrapper[4749]: I0320 07:32:20.067202 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/83e26e83-f14d-4346-a537-332070445891-additional-scripts\") pod \"ovn-controller-tx9bw-config-cbhvp\" (UID: \"83e26e83-f14d-4346-a537-332070445891\") " pod="openstack/ovn-controller-tx9bw-config-cbhvp" Mar 20 07:32:20 crc kubenswrapper[4749]: I0320 07:32:20.067231 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/83e26e83-f14d-4346-a537-332070445891-var-run-ovn\") pod \"ovn-controller-tx9bw-config-cbhvp\" (UID: \"83e26e83-f14d-4346-a537-332070445891\") " pod="openstack/ovn-controller-tx9bw-config-cbhvp" Mar 20 07:32:20 crc kubenswrapper[4749]: I0320 07:32:20.168759 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/83e26e83-f14d-4346-a537-332070445891-var-run\") pod \"ovn-controller-tx9bw-config-cbhvp\" (UID: \"83e26e83-f14d-4346-a537-332070445891\") " pod="openstack/ovn-controller-tx9bw-config-cbhvp" Mar 20 07:32:20 crc kubenswrapper[4749]: I0320 07:32:20.168825 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9b97\" (UniqueName: 
\"kubernetes.io/projected/83e26e83-f14d-4346-a537-332070445891-kube-api-access-s9b97\") pod \"ovn-controller-tx9bw-config-cbhvp\" (UID: \"83e26e83-f14d-4346-a537-332070445891\") " pod="openstack/ovn-controller-tx9bw-config-cbhvp" Mar 20 07:32:20 crc kubenswrapper[4749]: I0320 07:32:20.168845 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/83e26e83-f14d-4346-a537-332070445891-additional-scripts\") pod \"ovn-controller-tx9bw-config-cbhvp\" (UID: \"83e26e83-f14d-4346-a537-332070445891\") " pod="openstack/ovn-controller-tx9bw-config-cbhvp" Mar 20 07:32:20 crc kubenswrapper[4749]: I0320 07:32:20.168866 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/83e26e83-f14d-4346-a537-332070445891-var-run-ovn\") pod \"ovn-controller-tx9bw-config-cbhvp\" (UID: \"83e26e83-f14d-4346-a537-332070445891\") " pod="openstack/ovn-controller-tx9bw-config-cbhvp" Mar 20 07:32:20 crc kubenswrapper[4749]: I0320 07:32:20.168946 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/83e26e83-f14d-4346-a537-332070445891-var-log-ovn\") pod \"ovn-controller-tx9bw-config-cbhvp\" (UID: \"83e26e83-f14d-4346-a537-332070445891\") " pod="openstack/ovn-controller-tx9bw-config-cbhvp" Mar 20 07:32:20 crc kubenswrapper[4749]: I0320 07:32:20.168963 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/83e26e83-f14d-4346-a537-332070445891-scripts\") pod \"ovn-controller-tx9bw-config-cbhvp\" (UID: \"83e26e83-f14d-4346-a537-332070445891\") " pod="openstack/ovn-controller-tx9bw-config-cbhvp" Mar 20 07:32:20 crc kubenswrapper[4749]: I0320 07:32:20.169455 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/83e26e83-f14d-4346-a537-332070445891-var-run-ovn\") pod \"ovn-controller-tx9bw-config-cbhvp\" (UID: \"83e26e83-f14d-4346-a537-332070445891\") " pod="openstack/ovn-controller-tx9bw-config-cbhvp" Mar 20 07:32:20 crc kubenswrapper[4749]: I0320 07:32:20.169738 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/83e26e83-f14d-4346-a537-332070445891-var-log-ovn\") pod \"ovn-controller-tx9bw-config-cbhvp\" (UID: \"83e26e83-f14d-4346-a537-332070445891\") " pod="openstack/ovn-controller-tx9bw-config-cbhvp" Mar 20 07:32:20 crc kubenswrapper[4749]: I0320 07:32:20.170158 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/83e26e83-f14d-4346-a537-332070445891-additional-scripts\") pod \"ovn-controller-tx9bw-config-cbhvp\" (UID: \"83e26e83-f14d-4346-a537-332070445891\") " pod="openstack/ovn-controller-tx9bw-config-cbhvp" Mar 20 07:32:20 crc kubenswrapper[4749]: I0320 07:32:20.170320 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/83e26e83-f14d-4346-a537-332070445891-var-run\") pod \"ovn-controller-tx9bw-config-cbhvp\" (UID: \"83e26e83-f14d-4346-a537-332070445891\") " pod="openstack/ovn-controller-tx9bw-config-cbhvp" Mar 20 07:32:20 crc kubenswrapper[4749]: I0320 07:32:20.170957 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/83e26e83-f14d-4346-a537-332070445891-scripts\") pod \"ovn-controller-tx9bw-config-cbhvp\" (UID: \"83e26e83-f14d-4346-a537-332070445891\") " pod="openstack/ovn-controller-tx9bw-config-cbhvp" Mar 20 07:32:20 crc kubenswrapper[4749]: I0320 07:32:20.189702 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9b97\" (UniqueName: \"kubernetes.io/projected/83e26e83-f14d-4346-a537-332070445891-kube-api-access-s9b97\") pod \"ovn-controller-tx9bw-config-cbhvp\" (UID: \"83e26e83-f14d-4346-a537-332070445891\") " pod="openstack/ovn-controller-tx9bw-config-cbhvp" Mar 20 07:32:20 crc kubenswrapper[4749]: I0320 07:32:20.300942 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-tx9bw-config-cbhvp" Mar 20 07:32:21 crc kubenswrapper[4749]: I0320 07:32:21.213831 4749 generic.go:334] "Generic (PLEG): container finished" podID="8b9b402f-2d95-48f5-98d8-497d90956ba2" containerID="0763b178530122b3e8e381a52b72b60d0103b38188e175f70597899afb88e2da" exitCode=0 Mar 20 07:32:21 crc kubenswrapper[4749]: I0320 07:32:21.213908 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8b9b402f-2d95-48f5-98d8-497d90956ba2","Type":"ContainerDied","Data":"0763b178530122b3e8e381a52b72b60d0103b38188e175f70597899afb88e2da"} Mar 20 07:32:21 crc kubenswrapper[4749]: I0320 07:32:21.214154 4749 scope.go:117] "RemoveContainer" containerID="550cc6e3eedc7eeebc5abed9e9349810e24e9b6751499624000a1720500e207b" Mar 20 07:32:21 crc kubenswrapper[4749]: I0320 07:32:21.214814 4749 scope.go:117] "RemoveContainer" containerID="0763b178530122b3e8e381a52b72b60d0103b38188e175f70597899afb88e2da" Mar 20 07:32:21 crc kubenswrapper[4749]: E0320 07:32:21.215088 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 10s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:32:24 crc kubenswrapper[4749]: I0320 07:32:24.683343 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-tx9bw" podUID="32ceaa95-18d9-4f1e-9ebd-f2d413709413" containerName="ovn-controller" probeResult="failure" output=< Mar 20 07:32:24 crc kubenswrapper[4749]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 20 07:32:24 crc kubenswrapper[4749]: > Mar 20 07:32:25 crc kubenswrapper[4749]: I0320 07:32:25.249032 4749 generic.go:334] "Generic (PLEG): container finished" podID="8db06e36-0b00-4157-9345-69449da3e85f" containerID="30cf25cee069fd79718872b52ff67190111f7e963a3d6bd02d0024f6aff141bb" exitCode=0 Mar 20 07:32:25 crc kubenswrapper[4749]: I0320 07:32:25.249092 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8db06e36-0b00-4157-9345-69449da3e85f","Type":"ContainerDied","Data":"30cf25cee069fd79718872b52ff67190111f7e963a3d6bd02d0024f6aff141bb"} Mar 20 07:32:26 crc kubenswrapper[4749]: I0320 07:32:26.898374 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-tx9bw-config-cbhvp"] Mar 20 07:32:26 crc kubenswrapper[4749]: W0320 07:32:26.904334 4749 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83e26e83_f14d_4346_a537_332070445891.slice/crio-976a0a31d1b940993c9bcca6d166a940fae251538f6a49a934cace457d8a62db WatchSource:0}: Error finding container 976a0a31d1b940993c9bcca6d166a940fae251538f6a49a934cace457d8a62db: Status 404 returned error can't find the container with id 976a0a31d1b940993c9bcca6d166a940fae251538f6a49a934cace457d8a62db Mar 20 07:32:27 crc kubenswrapper[4749]: I0320 07:32:27.285514 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8db06e36-0b00-4157-9345-69449da3e85f","Type":"ContainerStarted","Data":"191a7ad9cc4bb6e1435c3a2ebbeae3bf07a7c9404bfba46f2c7d6deda1b286d4"} Mar 20 07:32:27 crc kubenswrapper[4749]: I0320 07:32:27.286127 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 20 07:32:27 crc kubenswrapper[4749]: I0320 07:32:27.291632 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-qm6tz" event={"ID":"7ae472d6-49bf-44a4-85a3-30e1dd169d3a","Type":"ContainerStarted","Data":"49b4f5449d47da38b079ed5043f4527a09a72ed466b1188f24387c92f34f2255"} Mar 20 07:32:27 crc kubenswrapper[4749]: I0320 07:32:27.294308 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-tx9bw-config-cbhvp" event={"ID":"83e26e83-f14d-4346-a537-332070445891","Type":"ContainerStarted","Data":"6436937fc1409f5452958dcc9ea51d79e32e66f9d4790788ef19b4e069433472"} Mar 20 07:32:27 crc kubenswrapper[4749]: I0320 07:32:27.294340 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-tx9bw-config-cbhvp" event={"ID":"83e26e83-f14d-4346-a537-332070445891","Type":"ContainerStarted","Data":"976a0a31d1b940993c9bcca6d166a940fae251538f6a49a934cace457d8a62db"} Mar 20 07:32:27 crc kubenswrapper[4749]: I0320 07:32:27.327369 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=-9223371968.527424 podStartE2EDuration="1m8.327352292s" podCreationTimestamp="2026-03-20 07:31:19 +0000 UTC" firstStartedPulling="2026-03-20 07:31:20.964806273 +0000 UTC m=+1117.514463920" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:32:27.309886769 +0000 UTC m=+1183.859544456" watchObservedRunningTime="2026-03-20 07:32:27.327352292 +0000 UTC m=+1183.877009939" Mar 20 07:32:27 crc kubenswrapper[4749]: I0320 07:32:27.331233 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-qm6tz" podStartSLOduration=2.343785238 podStartE2EDuration="13.331216476s" podCreationTimestamp="2026-03-20 07:32:14 +0000 UTC" firstStartedPulling="2026-03-20 07:32:15.512136911 +0000 UTC m=+1172.061794558" lastFinishedPulling="2026-03-20 07:32:26.499568129 +0000 UTC m=+1183.049225796" observedRunningTime="2026-03-20 07:32:27.326536972 +0000 UTC m=+1183.876194619" watchObservedRunningTime="2026-03-20 07:32:27.331216476 +0000 UTC m=+1183.880874123" Mar 20 07:32:27 crc kubenswrapper[4749]: I0320 07:32:27.352876 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-tx9bw-config-cbhvp" podStartSLOduration=8.352862849 podStartE2EDuration="8.352862849s" podCreationTimestamp="2026-03-20 07:32:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:32:27.351987057 +0000 UTC m=+1183.901644704" 
watchObservedRunningTime="2026-03-20 07:32:27.352862849 +0000 UTC m=+1183.902520496" Mar 20 07:32:28 crc kubenswrapper[4749]: I0320 07:32:28.309116 4749 generic.go:334] "Generic (PLEG): container finished" podID="83e26e83-f14d-4346-a537-332070445891" containerID="6436937fc1409f5452958dcc9ea51d79e32e66f9d4790788ef19b4e069433472" exitCode=0 Mar 20 07:32:28 crc kubenswrapper[4749]: I0320 07:32:28.309313 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-tx9bw-config-cbhvp" event={"ID":"83e26e83-f14d-4346-a537-332070445891","Type":"ContainerDied","Data":"6436937fc1409f5452958dcc9ea51d79e32e66f9d4790788ef19b4e069433472"} Mar 20 07:32:29 crc kubenswrapper[4749]: I0320 07:32:29.652431 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8272956e-b31a-4bd8-9118-3ca9721e6d75-etc-swift\") pod \"swift-storage-0\" (UID: \"8272956e-b31a-4bd8-9118-3ca9721e6d75\") " pod="openstack/swift-storage-0" Mar 20 07:32:29 crc kubenswrapper[4749]: I0320 07:32:29.660307 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8272956e-b31a-4bd8-9118-3ca9721e6d75-etc-swift\") pod \"swift-storage-0\" (UID: \"8272956e-b31a-4bd8-9118-3ca9721e6d75\") " pod="openstack/swift-storage-0" Mar 20 07:32:29 crc kubenswrapper[4749]: I0320 07:32:29.682735 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-tx9bw" Mar 20 07:32:29 crc kubenswrapper[4749]: I0320 07:32:29.715818 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-tx9bw-config-cbhvp" Mar 20 07:32:29 crc kubenswrapper[4749]: I0320 07:32:29.754150 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/83e26e83-f14d-4346-a537-332070445891-additional-scripts\") pod \"83e26e83-f14d-4346-a537-332070445891\" (UID: \"83e26e83-f14d-4346-a537-332070445891\") " Mar 20 07:32:29 crc kubenswrapper[4749]: I0320 07:32:29.754231 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/83e26e83-f14d-4346-a537-332070445891-var-log-ovn\") pod \"83e26e83-f14d-4346-a537-332070445891\" (UID: \"83e26e83-f14d-4346-a537-332070445891\") " Mar 20 07:32:29 crc kubenswrapper[4749]: I0320 07:32:29.754374 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/83e26e83-f14d-4346-a537-332070445891-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "83e26e83-f14d-4346-a537-332070445891" (UID: "83e26e83-f14d-4346-a537-332070445891"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 07:32:29 crc kubenswrapper[4749]: I0320 07:32:29.754397 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/83e26e83-f14d-4346-a537-332070445891-var-run-ovn\") pod \"83e26e83-f14d-4346-a537-332070445891\" (UID: \"83e26e83-f14d-4346-a537-332070445891\") " Mar 20 07:32:29 crc kubenswrapper[4749]: I0320 07:32:29.754432 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/83e26e83-f14d-4346-a537-332070445891-scripts\") pod \"83e26e83-f14d-4346-a537-332070445891\" (UID: \"83e26e83-f14d-4346-a537-332070445891\") " Mar 20 07:32:29 crc kubenswrapper[4749]: I0320 07:32:29.754463 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9b97\" (UniqueName: \"kubernetes.io/projected/83e26e83-f14d-4346-a537-332070445891-kube-api-access-s9b97\") pod \"83e26e83-f14d-4346-a537-332070445891\" (UID: \"83e26e83-f14d-4346-a537-332070445891\") " Mar 20 07:32:29 crc kubenswrapper[4749]: I0320 07:32:29.754492 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/83e26e83-f14d-4346-a537-332070445891-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "83e26e83-f14d-4346-a537-332070445891" (UID: "83e26e83-f14d-4346-a537-332070445891"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 07:32:29 crc kubenswrapper[4749]: I0320 07:32:29.754567 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/83e26e83-f14d-4346-a537-332070445891-var-run\") pod \"83e26e83-f14d-4346-a537-332070445891\" (UID: \"83e26e83-f14d-4346-a537-332070445891\") " Mar 20 07:32:29 crc kubenswrapper[4749]: I0320 07:32:29.754666 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/83e26e83-f14d-4346-a537-332070445891-var-run" (OuterVolumeSpecName: "var-run") pod "83e26e83-f14d-4346-a537-332070445891" (UID: "83e26e83-f14d-4346-a537-332070445891"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 07:32:29 crc kubenswrapper[4749]: I0320 07:32:29.754990 4749 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/83e26e83-f14d-4346-a537-332070445891-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:29 crc kubenswrapper[4749]: I0320 07:32:29.755014 4749 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/83e26e83-f14d-4346-a537-332070445891-var-run\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:29 crc kubenswrapper[4749]: I0320 07:32:29.755028 4749 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/83e26e83-f14d-4346-a537-332070445891-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:29 crc kubenswrapper[4749]: I0320 07:32:29.756060 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83e26e83-f14d-4346-a537-332070445891-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "83e26e83-f14d-4346-a537-332070445891" (UID: "83e26e83-f14d-4346-a537-332070445891"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:32:29 crc kubenswrapper[4749]: I0320 07:32:29.756238 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83e26e83-f14d-4346-a537-332070445891-scripts" (OuterVolumeSpecName: "scripts") pod "83e26e83-f14d-4346-a537-332070445891" (UID: "83e26e83-f14d-4346-a537-332070445891"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:32:29 crc kubenswrapper[4749]: I0320 07:32:29.767252 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83e26e83-f14d-4346-a537-332070445891-kube-api-access-s9b97" (OuterVolumeSpecName: "kube-api-access-s9b97") pod "83e26e83-f14d-4346-a537-332070445891" (UID: "83e26e83-f14d-4346-a537-332070445891"). InnerVolumeSpecName "kube-api-access-s9b97". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:32:29 crc kubenswrapper[4749]: I0320 07:32:29.832725 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 20 07:32:29 crc kubenswrapper[4749]: I0320 07:32:29.856762 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/83e26e83-f14d-4346-a537-332070445891-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:29 crc kubenswrapper[4749]: I0320 07:32:29.856806 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9b97\" (UniqueName: \"kubernetes.io/projected/83e26e83-f14d-4346-a537-332070445891-kube-api-access-s9b97\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:29 crc kubenswrapper[4749]: I0320 07:32:29.856820 4749 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/83e26e83-f14d-4346-a537-332070445891-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:30 crc kubenswrapper[4749]: I0320 07:32:30.004679 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-tx9bw-config-cbhvp"] Mar 20 07:32:30 crc kubenswrapper[4749]: I0320 07:32:30.020377 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-tx9bw-config-cbhvp"] Mar 20 07:32:30 crc kubenswrapper[4749]: I0320 07:32:30.130295 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 20 07:32:30 crc kubenswrapper[4749]: W0320 07:32:30.135626 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8272956e_b31a_4bd8_9118_3ca9721e6d75.slice/crio-152c475ebf2f6959a49c17120f5aa68c145f04807a6653eebc80c67368b7174d WatchSource:0}: Error finding container 152c475ebf2f6959a49c17120f5aa68c145f04807a6653eebc80c67368b7174d: Status 404 returned error can't find the container with id 152c475ebf2f6959a49c17120f5aa68c145f04807a6653eebc80c67368b7174d Mar 20 07:32:30 crc kubenswrapper[4749]: I0320 07:32:30.188759 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83e26e83-f14d-4346-a537-332070445891" path="/var/lib/kubelet/pods/83e26e83-f14d-4346-a537-332070445891/volumes" Mar 20 07:32:30 crc kubenswrapper[4749]: I0320 07:32:30.328882 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8272956e-b31a-4bd8-9118-3ca9721e6d75","Type":"ContainerStarted","Data":"152c475ebf2f6959a49c17120f5aa68c145f04807a6653eebc80c67368b7174d"} Mar 20 07:32:30 crc kubenswrapper[4749]: I0320 
07:32:30.330617 4749 scope.go:117] "RemoveContainer" containerID="6436937fc1409f5452958dcc9ea51d79e32e66f9d4790788ef19b4e069433472" Mar 20 07:32:30 crc kubenswrapper[4749]: I0320 07:32:30.330706 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-tx9bw-config-cbhvp" Mar 20 07:32:31 crc kubenswrapper[4749]: I0320 07:32:31.367353 4749 generic.go:334] "Generic (PLEG): container finished" podID="8db06e36-0b00-4157-9345-69449da3e85f" containerID="191a7ad9cc4bb6e1435c3a2ebbeae3bf07a7c9404bfba46f2c7d6deda1b286d4" exitCode=0 Mar 20 07:32:31 crc kubenswrapper[4749]: I0320 07:32:31.367425 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8db06e36-0b00-4157-9345-69449da3e85f","Type":"ContainerDied","Data":"191a7ad9cc4bb6e1435c3a2ebbeae3bf07a7c9404bfba46f2c7d6deda1b286d4"} Mar 20 07:32:31 crc kubenswrapper[4749]: I0320 07:32:31.369186 4749 scope.go:117] "RemoveContainer" containerID="191a7ad9cc4bb6e1435c3a2ebbeae3bf07a7c9404bfba46f2c7d6deda1b286d4" Mar 20 07:32:32 crc kubenswrapper[4749]: I0320 07:32:32.382198 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8db06e36-0b00-4157-9345-69449da3e85f","Type":"ContainerStarted","Data":"33bc94bce903b7afb8d3950455538835c2def1369c4cf0f1b11ec4712f53f659"} Mar 20 07:32:32 crc kubenswrapper[4749]: I0320 07:32:32.384144 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 20 07:32:32 crc kubenswrapper[4749]: I0320 07:32:32.387488 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8272956e-b31a-4bd8-9118-3ca9721e6d75","Type":"ContainerStarted","Data":"a15115ee2459f97060a010766e82ff311a283b6b9cc19655d7c21d3ad15224f1"} Mar 20 07:32:32 crc kubenswrapper[4749]: I0320 07:32:32.387540 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8272956e-b31a-4bd8-9118-3ca9721e6d75","Type":"ContainerStarted","Data":"82f87632d51765670c7fd4fdb627f996db6cba87fee9c6418c55349fbc2d8ae3"} Mar 20 07:32:32 crc kubenswrapper[4749]: I0320 07:32:32.387556 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8272956e-b31a-4bd8-9118-3ca9721e6d75","Type":"ContainerStarted","Data":"14c493e31287fe1ed7109f204ee05673576bda5365752772999602d21ec5b55c"} Mar 20 07:32:32 crc kubenswrapper[4749]: I0320 07:32:32.387568 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8272956e-b31a-4bd8-9118-3ca9721e6d75","Type":"ContainerStarted","Data":"a72389f093232fac519e4a936f2a8e6567bfa6c75b9fc3b149231eecf1171806"} Mar 20 07:32:33 crc kubenswrapper[4749]: I0320 07:32:33.403528 4749 generic.go:334] "Generic (PLEG): container finished" podID="7ae472d6-49bf-44a4-85a3-30e1dd169d3a" containerID="49b4f5449d47da38b079ed5043f4527a09a72ed466b1188f24387c92f34f2255" exitCode=0 Mar 20 07:32:33 crc kubenswrapper[4749]: I0320 07:32:33.404074 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-qm6tz" event={"ID":"7ae472d6-49bf-44a4-85a3-30e1dd169d3a","Type":"ContainerDied","Data":"49b4f5449d47da38b079ed5043f4527a09a72ed466b1188f24387c92f34f2255"} Mar 20 07:32:34 crc kubenswrapper[4749]: I0320 07:32:34.422101 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"8272956e-b31a-4bd8-9118-3ca9721e6d75","Type":"ContainerStarted","Data":"4c5edf5e432f9c5eb3ad78a45c37b2d2d394e173af3e53c257173c403373eda3"} Mar 20 07:32:34 crc kubenswrapper[4749]: I0320 07:32:34.422602 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8272956e-b31a-4bd8-9118-3ca9721e6d75","Type":"ContainerStarted","Data":"3270599347c29a1aa49e49f68dc9035e580beb39dd574fd819b2be70e43e811e"} Mar 20 07:32:34 crc kubenswrapper[4749]: I0320 07:32:34.422639 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8272956e-b31a-4bd8-9118-3ca9721e6d75","Type":"ContainerStarted","Data":"d0a1bbbaaeb1726b535a3a725629a63398af01ccec9bad7d1a914ace6feeb88d"} Mar 20 07:32:34 crc kubenswrapper[4749]: I0320 07:32:34.514908 4749 patch_prober.go:28] interesting pod/machine-config-daemon-fxqfd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 07:32:34 crc kubenswrapper[4749]: I0320 07:32:34.514988 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:32:34 crc kubenswrapper[4749]: I0320 07:32:34.515047 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" Mar 20 07:32:34 crc kubenswrapper[4749]: I0320 07:32:34.515842 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"72a24d5f0786b3da9aac01d553c981fdcf13ebc1b2358317a489547c93d570db"} pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 07:32:34 crc kubenswrapper[4749]: I0320 07:32:34.515910 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerName="machine-config-daemon" containerID="cri-o://72a24d5f0786b3da9aac01d553c981fdcf13ebc1b2358317a489547c93d570db" gracePeriod=600 Mar 20 07:32:34 crc kubenswrapper[4749]: I0320 07:32:34.992677 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-qm6tz" Mar 20 07:32:35 crc kubenswrapper[4749]: I0320 07:32:35.071119 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7ae472d6-49bf-44a4-85a3-30e1dd169d3a-db-sync-config-data\") pod \"7ae472d6-49bf-44a4-85a3-30e1dd169d3a\" (UID: \"7ae472d6-49bf-44a4-85a3-30e1dd169d3a\") " Mar 20 07:32:35 crc kubenswrapper[4749]: I0320 07:32:35.071929 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ae472d6-49bf-44a4-85a3-30e1dd169d3a-config-data\") pod \"7ae472d6-49bf-44a4-85a3-30e1dd169d3a\" (UID: \"7ae472d6-49bf-44a4-85a3-30e1dd169d3a\") " Mar 20 07:32:35 crc kubenswrapper[4749]: I0320 07:32:35.072023 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ae472d6-49bf-44a4-85a3-30e1dd169d3a-combined-ca-bundle\") pod \"7ae472d6-49bf-44a4-85a3-30e1dd169d3a\" (UID: \"7ae472d6-49bf-44a4-85a3-30e1dd169d3a\") " Mar 20 07:32:35 crc kubenswrapper[4749]: I0320 07:32:35.072069 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbvlw\" (UniqueName: \"kubernetes.io/projected/7ae472d6-49bf-44a4-85a3-30e1dd169d3a-kube-api-access-fbvlw\") pod \"7ae472d6-49bf-44a4-85a3-30e1dd169d3a\" (UID: \"7ae472d6-49bf-44a4-85a3-30e1dd169d3a\") " Mar 20 07:32:35 crc kubenswrapper[4749]: I0320 07:32:35.107672 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ae472d6-49bf-44a4-85a3-30e1dd169d3a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "7ae472d6-49bf-44a4-85a3-30e1dd169d3a" (UID: "7ae472d6-49bf-44a4-85a3-30e1dd169d3a"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:32:35 crc kubenswrapper[4749]: I0320 07:32:35.107712 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ae472d6-49bf-44a4-85a3-30e1dd169d3a-kube-api-access-fbvlw" (OuterVolumeSpecName: "kube-api-access-fbvlw") pod "7ae472d6-49bf-44a4-85a3-30e1dd169d3a" (UID: "7ae472d6-49bf-44a4-85a3-30e1dd169d3a"). InnerVolumeSpecName "kube-api-access-fbvlw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:32:35 crc kubenswrapper[4749]: I0320 07:32:35.145616 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ae472d6-49bf-44a4-85a3-30e1dd169d3a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ae472d6-49bf-44a4-85a3-30e1dd169d3a" (UID: "7ae472d6-49bf-44a4-85a3-30e1dd169d3a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:32:35 crc kubenswrapper[4749]: I0320 07:32:35.163600 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ae472d6-49bf-44a4-85a3-30e1dd169d3a-config-data" (OuterVolumeSpecName: "config-data") pod "7ae472d6-49bf-44a4-85a3-30e1dd169d3a" (UID: "7ae472d6-49bf-44a4-85a3-30e1dd169d3a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:32:35 crc kubenswrapper[4749]: I0320 07:32:35.174609 4749 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7ae472d6-49bf-44a4-85a3-30e1dd169d3a-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:35 crc kubenswrapper[4749]: I0320 07:32:35.174647 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ae472d6-49bf-44a4-85a3-30e1dd169d3a-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:35 crc kubenswrapper[4749]: I0320 07:32:35.174658 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ae472d6-49bf-44a4-85a3-30e1dd169d3a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:35 crc kubenswrapper[4749]: I0320 07:32:35.174667 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbvlw\" (UniqueName: \"kubernetes.io/projected/7ae472d6-49bf-44a4-85a3-30e1dd169d3a-kube-api-access-fbvlw\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:35 crc kubenswrapper[4749]: I0320 07:32:35.177761 4749 scope.go:117] "RemoveContainer" containerID="0763b178530122b3e8e381a52b72b60d0103b38188e175f70597899afb88e2da" Mar 20 07:32:35 crc kubenswrapper[4749]: I0320 07:32:35.432239 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8272956e-b31a-4bd8-9118-3ca9721e6d75","Type":"ContainerStarted","Data":"b9c33dcbd975d5292194b9083be080f009bfefbf047450f7c5c000971c1c8246"} Mar 20 07:32:35 crc kubenswrapper[4749]: I0320 07:32:35.434396 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-qm6tz" Mar 20 07:32:35 crc kubenswrapper[4749]: I0320 07:32:35.434495 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-qm6tz" event={"ID":"7ae472d6-49bf-44a4-85a3-30e1dd169d3a","Type":"ContainerDied","Data":"bbe8324f1293a150413b3046bba42bab693dce7514988586e6f5b40eeae33630"} Mar 20 07:32:35 crc kubenswrapper[4749]: I0320 07:32:35.434720 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bbe8324f1293a150413b3046bba42bab693dce7514988586e6f5b40eeae33630" Mar 20 07:32:35 crc kubenswrapper[4749]: I0320 07:32:35.436726 4749 generic.go:334] "Generic (PLEG): container finished" podID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerID="72a24d5f0786b3da9aac01d553c981fdcf13ebc1b2358317a489547c93d570db" exitCode=0 Mar 20 07:32:35 crc kubenswrapper[4749]: I0320 07:32:35.436783 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" event={"ID":"12151228-1cb9-4086-9a62-f4a9583f5f69","Type":"ContainerDied","Data":"72a24d5f0786b3da9aac01d553c981fdcf13ebc1b2358317a489547c93d570db"} Mar 20 07:32:35 crc kubenswrapper[4749]: I0320 07:32:35.437111 4749 scope.go:117] "RemoveContainer" containerID="3c74897c54ef7454cef1084b8e06312bda867ecfea849b2a4ba3d53fa61618a4" Mar 20 07:32:35 crc kubenswrapper[4749]: I0320 07:32:35.437817 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" event={"ID":"12151228-1cb9-4086-9a62-f4a9583f5f69","Type":"ContainerStarted","Data":"91c7008ee23efd7c3f17163220decd2750e56886d85eb31073250a59b5bc0138"} Mar 20 07:32:35 crc kubenswrapper[4749]: I0320 07:32:35.830005 4749 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-5b946c75cc-5vqsx"] Mar 20 07:32:35 crc kubenswrapper[4749]: E0320 07:32:35.830411 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83e26e83-f14d-4346-a537-332070445891" containerName="ovn-config" Mar 20 07:32:35 crc kubenswrapper[4749]: I0320 07:32:35.830429 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="83e26e83-f14d-4346-a537-332070445891" containerName="ovn-config" Mar 20 07:32:35 crc kubenswrapper[4749]: E0320 07:32:35.830441 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ae472d6-49bf-44a4-85a3-30e1dd169d3a" containerName="glance-db-sync" Mar 20 07:32:35 crc kubenswrapper[4749]: I0320 07:32:35.830449 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ae472d6-49bf-44a4-85a3-30e1dd169d3a" containerName="glance-db-sync" Mar 20 07:32:35 crc kubenswrapper[4749]: I0320 07:32:35.830604 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ae472d6-49bf-44a4-85a3-30e1dd169d3a" containerName="glance-db-sync" Mar 20 07:32:35 crc kubenswrapper[4749]: I0320 07:32:35.830623 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="83e26e83-f14d-4346-a537-332070445891" containerName="ovn-config" Mar 20 07:32:35 crc kubenswrapper[4749]: I0320 07:32:35.836157 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-5vqsx" Mar 20 07:32:35 crc kubenswrapper[4749]: I0320 07:32:35.851838 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-5vqsx"] Mar 20 07:32:35 crc kubenswrapper[4749]: I0320 07:32:35.886154 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/856b835b-2c70-4faa-a12a-43d6f59195f4-config\") pod \"dnsmasq-dns-5b946c75cc-5vqsx\" (UID: \"856b835b-2c70-4faa-a12a-43d6f59195f4\") " pod="openstack/dnsmasq-dns-5b946c75cc-5vqsx" Mar 20 07:32:35 crc kubenswrapper[4749]: I0320 07:32:35.886226 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/856b835b-2c70-4faa-a12a-43d6f59195f4-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-5vqsx\" (UID: \"856b835b-2c70-4faa-a12a-43d6f59195f4\") " pod="openstack/dnsmasq-dns-5b946c75cc-5vqsx" Mar 20 07:32:35 crc kubenswrapper[4749]: I0320 07:32:35.886290 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/856b835b-2c70-4faa-a12a-43d6f59195f4-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-5vqsx\" (UID: \"856b835b-2c70-4faa-a12a-43d6f59195f4\") " pod="openstack/dnsmasq-dns-5b946c75cc-5vqsx" Mar 20 07:32:35 crc kubenswrapper[4749]: I0320 07:32:35.886312 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/856b835b-2c70-4faa-a12a-43d6f59195f4-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-5vqsx\" (UID: \"856b835b-2c70-4faa-a12a-43d6f59195f4\") " pod="openstack/dnsmasq-dns-5b946c75cc-5vqsx" Mar 20 07:32:35 crc kubenswrapper[4749]: I0320 07:32:35.886355 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9p4r\" (UniqueName: \"kubernetes.io/projected/856b835b-2c70-4faa-a12a-43d6f59195f4-kube-api-access-l9p4r\") pod \"dnsmasq-dns-5b946c75cc-5vqsx\" (UID: \"856b835b-2c70-4faa-a12a-43d6f59195f4\") 
" pod="openstack/dnsmasq-dns-5b946c75cc-5vqsx" Mar 20 07:32:35 crc kubenswrapper[4749]: I0320 07:32:35.986775 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9p4r\" (UniqueName: \"kubernetes.io/projected/856b835b-2c70-4faa-a12a-43d6f59195f4-kube-api-access-l9p4r\") pod \"dnsmasq-dns-5b946c75cc-5vqsx\" (UID: \"856b835b-2c70-4faa-a12a-43d6f59195f4\") " pod="openstack/dnsmasq-dns-5b946c75cc-5vqsx" Mar 20 07:32:35 crc kubenswrapper[4749]: I0320 07:32:35.986849 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/856b835b-2c70-4faa-a12a-43d6f59195f4-config\") pod \"dnsmasq-dns-5b946c75cc-5vqsx\" (UID: \"856b835b-2c70-4faa-a12a-43d6f59195f4\") " pod="openstack/dnsmasq-dns-5b946c75cc-5vqsx" Mar 20 07:32:35 crc kubenswrapper[4749]: I0320 07:32:35.986876 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/856b835b-2c70-4faa-a12a-43d6f59195f4-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-5vqsx\" (UID: \"856b835b-2c70-4faa-a12a-43d6f59195f4\") " pod="openstack/dnsmasq-dns-5b946c75cc-5vqsx" Mar 20 07:32:35 crc kubenswrapper[4749]: I0320 07:32:35.986913 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/856b835b-2c70-4faa-a12a-43d6f59195f4-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-5vqsx\" (UID: \"856b835b-2c70-4faa-a12a-43d6f59195f4\") " pod="openstack/dnsmasq-dns-5b946c75cc-5vqsx" Mar 20 07:32:35 crc kubenswrapper[4749]: I0320 07:32:35.986934 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/856b835b-2c70-4faa-a12a-43d6f59195f4-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-5vqsx\" (UID: \"856b835b-2c70-4faa-a12a-43d6f59195f4\") " pod="openstack/dnsmasq-dns-5b946c75cc-5vqsx" Mar 20 07:32:35 crc kubenswrapper[4749]: I0320 07:32:35.987796 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/856b835b-2c70-4faa-a12a-43d6f59195f4-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-5vqsx\" (UID: \"856b835b-2c70-4faa-a12a-43d6f59195f4\") " pod="openstack/dnsmasq-dns-5b946c75cc-5vqsx" Mar 20 07:32:35 crc kubenswrapper[4749]: I0320 07:32:35.988595 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/856b835b-2c70-4faa-a12a-43d6f59195f4-config\") pod \"dnsmasq-dns-5b946c75cc-5vqsx\" (UID: \"856b835b-2c70-4faa-a12a-43d6f59195f4\") " pod="openstack/dnsmasq-dns-5b946c75cc-5vqsx" Mar 20 07:32:35 crc kubenswrapper[4749]: I0320 07:32:35.989110 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/856b835b-2c70-4faa-a12a-43d6f59195f4-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-5vqsx\" (UID: \"856b835b-2c70-4faa-a12a-43d6f59195f4\") " pod="openstack/dnsmasq-dns-5b946c75cc-5vqsx" Mar 20 07:32:35 crc kubenswrapper[4749]: I0320 07:32:35.989897 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/856b835b-2c70-4faa-a12a-43d6f59195f4-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-5vqsx\" (UID: \"856b835b-2c70-4faa-a12a-43d6f59195f4\") " pod="openstack/dnsmasq-dns-5b946c75cc-5vqsx" Mar 20 07:32:36 crc kubenswrapper[4749]: I0320 07:32:36.007293 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9p4r\" (UniqueName: \"kubernetes.io/projected/856b835b-2c70-4faa-a12a-43d6f59195f4-kube-api-access-l9p4r\") pod \"dnsmasq-dns-5b946c75cc-5vqsx\" (UID: \"856b835b-2c70-4faa-a12a-43d6f59195f4\") " pod="openstack/dnsmasq-dns-5b946c75cc-5vqsx" Mar 20 07:32:36 crc kubenswrapper[4749]: I0320 07:32:36.161193 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-5vqsx" Mar 20 07:32:36 crc kubenswrapper[4749]: I0320 07:32:36.449647 4749 generic.go:334] "Generic (PLEG): container finished" podID="8db06e36-0b00-4157-9345-69449da3e85f" containerID="33bc94bce903b7afb8d3950455538835c2def1369c4cf0f1b11ec4712f53f659" exitCode=0 Mar 20 07:32:36 crc kubenswrapper[4749]: I0320 07:32:36.449728 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8db06e36-0b00-4157-9345-69449da3e85f","Type":"ContainerDied","Data":"33bc94bce903b7afb8d3950455538835c2def1369c4cf0f1b11ec4712f53f659"} Mar 20 07:32:36 crc kubenswrapper[4749]: I0320 07:32:36.449996 4749 scope.go:117] "RemoveContainer" containerID="191a7ad9cc4bb6e1435c3a2ebbeae3bf07a7c9404bfba46f2c7d6deda1b286d4" Mar 20 07:32:36 crc kubenswrapper[4749]: I0320 07:32:36.450767 4749 scope.go:117] "RemoveContainer" containerID="33bc94bce903b7afb8d3950455538835c2def1369c4cf0f1b11ec4712f53f659" Mar 20 07:32:36 crc kubenswrapper[4749]: E0320 07:32:36.451040 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 10s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:32:36 crc kubenswrapper[4749]: I0320 07:32:36.469398 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8b9b402f-2d95-48f5-98d8-497d90956ba2","Type":"ContainerStarted","Data":"8899097ca64e738bbb8bc0dd1bf1ddd25ca9fa4e615bd16c8f57a09d2709c496"} Mar 20 07:32:36 crc kubenswrapper[4749]: I0320 07:32:36.470258 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:32:36 crc kubenswrapper[4749]: I0320 07:32:36.897218 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-5vqsx"] Mar 20 07:32:36 crc kubenswrapper[4749]: W0320 07:32:36.921956 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod856b835b_2c70_4faa_a12a_43d6f59195f4.slice/crio-e7f6693d0258dc9f6952b7ed90f51e4d189640a4e6fa9f24c00d470dfc1e86ef WatchSource:0}: Error finding container e7f6693d0258dc9f6952b7ed90f51e4d189640a4e6fa9f24c00d470dfc1e86ef: Status 404 returned error can't find the container with id e7f6693d0258dc9f6952b7ed90f51e4d189640a4e6fa9f24c00d470dfc1e86ef Mar 20 07:32:37 crc kubenswrapper[4749]: I0320 07:32:37.516388 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8272956e-b31a-4bd8-9118-3ca9721e6d75","Type":"ContainerStarted","Data":"62efce134b5b40ddb656942640a94a98da8062e2ab1ca319f601b3ac15a0505b"} Mar 20 07:32:37 crc kubenswrapper[4749]: I0320 07:32:37.516737 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"8272956e-b31a-4bd8-9118-3ca9721e6d75","Type":"ContainerStarted","Data":"8370453c68b82553cb71993cb81f1de7023eb5289e0506e7732b7c33ee54a583"} Mar 20 07:32:37 crc kubenswrapper[4749]: I0320 07:32:37.516756 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8272956e-b31a-4bd8-9118-3ca9721e6d75","Type":"ContainerStarted","Data":"b10d6f6912b19dfcdba30ea9768bc48bab6735cf85b27ba34526767ac2cfe27a"} Mar 20 07:32:37 crc kubenswrapper[4749]: I0320 07:32:37.516769 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8272956e-b31a-4bd8-9118-3ca9721e6d75","Type":"ContainerStarted","Data":"e8ffd3b342b3918b34257d36d9d9afe13c22d88b1ca9bc305247619a99a7e7b7"} Mar 20 07:32:37 crc kubenswrapper[4749]: I0320 07:32:37.523436 4749 generic.go:334] "Generic (PLEG): container finished" podID="856b835b-2c70-4faa-a12a-43d6f59195f4" containerID="5cfc253b44e9c5875410308b13cec425bf973d810d8d634ba7562fe15b924465" exitCode=0 Mar 20 07:32:37 crc kubenswrapper[4749]: I0320 07:32:37.524814 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-5vqsx" event={"ID":"856b835b-2c70-4faa-a12a-43d6f59195f4","Type":"ContainerDied","Data":"5cfc253b44e9c5875410308b13cec425bf973d810d8d634ba7562fe15b924465"} Mar 20 07:32:37 crc kubenswrapper[4749]: I0320 07:32:37.524848 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-5vqsx" event={"ID":"856b835b-2c70-4faa-a12a-43d6f59195f4","Type":"ContainerStarted","Data":"e7f6693d0258dc9f6952b7ed90f51e4d189640a4e6fa9f24c00d470dfc1e86ef"} Mar 20 07:32:38 crc kubenswrapper[4749]: I0320 07:32:38.537901 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8272956e-b31a-4bd8-9118-3ca9721e6d75","Type":"ContainerStarted","Data":"c4c762decc327befd8958836c6bf498468adeddc4ad494c39a18d56703588d1f"} Mar 20 07:32:38 crc kubenswrapper[4749]: I0320 07:32:38.541423 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-5vqsx" event={"ID":"856b835b-2c70-4faa-a12a-43d6f59195f4","Type":"ContainerStarted","Data":"79557d091acc62b85af14507852b77df6d85efc7e3b724cc33d62415ffb87939"} Mar 20 07:32:40 crc kubenswrapper[4749]: I0320 07:32:40.563679 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8272956e-b31a-4bd8-9118-3ca9721e6d75","Type":"ContainerStarted","Data":"82fd878a82106106d84b6906374d58bc1f9fbae58059e5265cd33bdc4148a55a"} Mar 20 07:32:40 crc kubenswrapper[4749]: I0320 07:32:40.564250 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8272956e-b31a-4bd8-9118-3ca9721e6d75","Type":"ContainerStarted","Data":"36ce28f7d6221e20105aa0b3dc5aacb2bc1737f6ae88545e31871c311acc7c85"} Mar 20 07:32:40 crc kubenswrapper[4749]: I0320 07:32:40.566132 4749 generic.go:334] "Generic (PLEG): container finished" podID="8b9b402f-2d95-48f5-98d8-497d90956ba2" containerID="8899097ca64e738bbb8bc0dd1bf1ddd25ca9fa4e615bd16c8f57a09d2709c496" exitCode=0 Mar 20 07:32:40 crc kubenswrapper[4749]: I0320 07:32:40.566255 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8b9b402f-2d95-48f5-98d8-497d90956ba2","Type":"ContainerDied","Data":"8899097ca64e738bbb8bc0dd1bf1ddd25ca9fa4e615bd16c8f57a09d2709c496"} Mar 20 07:32:40 crc kubenswrapper[4749]: I0320 07:32:40.566380 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-5b946c75cc-5vqsx" Mar 20 07:32:40 crc kubenswrapper[4749]: I0320 07:32:40.566419 4749 scope.go:117] "RemoveContainer" containerID="0763b178530122b3e8e381a52b72b60d0103b38188e175f70597899afb88e2da" Mar 20 07:32:40 crc kubenswrapper[4749]: I0320 07:32:40.566819 4749 scope.go:117] "RemoveContainer" containerID="8899097ca64e738bbb8bc0dd1bf1ddd25ca9fa4e615bd16c8f57a09d2709c496" Mar 20 07:32:40 crc kubenswrapper[4749]: E0320 07:32:40.567134 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 20s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:32:40 crc kubenswrapper[4749]: I0320 07:32:40.618259 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=38.269567845 podStartE2EDuration="44.618231058s" podCreationTimestamp="2026-03-20 07:31:56 +0000 UTC" firstStartedPulling="2026-03-20 07:32:30.138074827 +0000 UTC m=+1186.687732484" lastFinishedPulling="2026-03-20 07:32:36.48673805 +0000 UTC m=+1193.036395697" observedRunningTime="2026-03-20 07:32:40.603119593 +0000 UTC m=+1197.152777250" watchObservedRunningTime="2026-03-20 07:32:40.618231058 +0000 UTC m=+1197.167888715" Mar 20 07:32:40 crc kubenswrapper[4749]: I0320 07:32:40.634343 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b946c75cc-5vqsx" podStartSLOduration=5.634276906 podStartE2EDuration="5.634276906s" podCreationTimestamp="2026-03-20 07:32:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:32:40.632814851 +0000 UTC m=+1197.182472528" watchObservedRunningTime="2026-03-20 07:32:40.634276906 +0000 UTC m=+1197.183934563" Mar 20 07:32:40 crc kubenswrapper[4749]: I0320 07:32:40.870568 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-5vqsx"] Mar 20 07:32:40 crc kubenswrapper[4749]: I0320 07:32:40.907859 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-l6jpr"] Mar 20 07:32:40 crc kubenswrapper[4749]: I0320 07:32:40.909510 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-l6jpr" Mar 20 07:32:40 crc kubenswrapper[4749]: I0320 07:32:40.911690 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 20 07:32:40 crc kubenswrapper[4749]: I0320 07:32:40.927125 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-l6jpr"] Mar 20 07:32:41 crc kubenswrapper[4749]: I0320 07:32:41.077689 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98b88bbe-3668-4194-840d-1ba64dd6c32e-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-l6jpr\" (UID: \"98b88bbe-3668-4194-840d-1ba64dd6c32e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-l6jpr" Mar 20 07:32:41 crc kubenswrapper[4749]: I0320 07:32:41.077864 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2727\" (UniqueName: \"kubernetes.io/projected/98b88bbe-3668-4194-840d-1ba64dd6c32e-kube-api-access-n2727\") pod \"dnsmasq-dns-74f6bcbc87-l6jpr\" (UID: \"98b88bbe-3668-4194-840d-1ba64dd6c32e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-l6jpr" Mar 20 07:32:41 crc kubenswrapper[4749]: I0320 07:32:41.077908 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98b88bbe-3668-4194-840d-1ba64dd6c32e-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-l6jpr\" (UID: \"98b88bbe-3668-4194-840d-1ba64dd6c32e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-l6jpr" Mar 20 07:32:41 crc kubenswrapper[4749]: I0320 07:32:41.077940 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98b88bbe-3668-4194-840d-1ba64dd6c32e-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-l6jpr\" (UID: \"98b88bbe-3668-4194-840d-1ba64dd6c32e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-l6jpr" Mar 20 07:32:41 crc kubenswrapper[4749]: I0320 07:32:41.078002 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98b88bbe-3668-4194-840d-1ba64dd6c32e-config\") pod \"dnsmasq-dns-74f6bcbc87-l6jpr\" (UID: \"98b88bbe-3668-4194-840d-1ba64dd6c32e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-l6jpr" Mar 20 07:32:41 crc kubenswrapper[4749]: I0320 07:32:41.078033 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/98b88bbe-3668-4194-840d-1ba64dd6c32e-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-l6jpr\" (UID: \"98b88bbe-3668-4194-840d-1ba64dd6c32e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-l6jpr" Mar 20 07:32:41 crc kubenswrapper[4749]: I0320 07:32:41.190922 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2727\" (UniqueName: \"kubernetes.io/projected/98b88bbe-3668-4194-840d-1ba64dd6c32e-kube-api-access-n2727\") pod \"dnsmasq-dns-74f6bcbc87-l6jpr\" (UID: \"98b88bbe-3668-4194-840d-1ba64dd6c32e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-l6jpr" Mar 20 07:32:41 crc kubenswrapper[4749]: I0320 07:32:41.191663 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98b88bbe-3668-4194-840d-1ba64dd6c32e-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-l6jpr\" (UID: 
\"98b88bbe-3668-4194-840d-1ba64dd6c32e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-l6jpr" Mar 20 07:32:41 crc kubenswrapper[4749]: I0320 07:32:41.193064 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98b88bbe-3668-4194-840d-1ba64dd6c32e-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-l6jpr\" (UID: \"98b88bbe-3668-4194-840d-1ba64dd6c32e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-l6jpr" Mar 20 07:32:41 crc kubenswrapper[4749]: I0320 07:32:41.193229 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98b88bbe-3668-4194-840d-1ba64dd6c32e-config\") pod \"dnsmasq-dns-74f6bcbc87-l6jpr\" (UID: \"98b88bbe-3668-4194-840d-1ba64dd6c32e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-l6jpr" Mar 20 07:32:41 crc kubenswrapper[4749]: I0320 07:32:41.193277 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/98b88bbe-3668-4194-840d-1ba64dd6c32e-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-l6jpr\" (UID: \"98b88bbe-3668-4194-840d-1ba64dd6c32e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-l6jpr" Mar 20 07:32:41 crc kubenswrapper[4749]: I0320 07:32:41.193494 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98b88bbe-3668-4194-840d-1ba64dd6c32e-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-l6jpr\" (UID: \"98b88bbe-3668-4194-840d-1ba64dd6c32e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-l6jpr" Mar 20 07:32:41 crc kubenswrapper[4749]: I0320 07:32:41.194552 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98b88bbe-3668-4194-840d-1ba64dd6c32e-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-l6jpr\" (UID: \"98b88bbe-3668-4194-840d-1ba64dd6c32e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-l6jpr" Mar 20 07:32:41 crc kubenswrapper[4749]: I0320 07:32:41.195603 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98b88bbe-3668-4194-840d-1ba64dd6c32e-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-l6jpr\" (UID: \"98b88bbe-3668-4194-840d-1ba64dd6c32e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-l6jpr" Mar 20 07:32:41 crc kubenswrapper[4749]: I0320 07:32:41.195692 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/98b88bbe-3668-4194-840d-1ba64dd6c32e-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-l6jpr\" (UID: \"98b88bbe-3668-4194-840d-1ba64dd6c32e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-l6jpr" Mar 20 07:32:41 crc kubenswrapper[4749]: I0320 07:32:41.207604 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98b88bbe-3668-4194-840d-1ba64dd6c32e-config\") pod \"dnsmasq-dns-74f6bcbc87-l6jpr\" (UID: \"98b88bbe-3668-4194-840d-1ba64dd6c32e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-l6jpr" Mar 20 07:32:41 crc kubenswrapper[4749]: I0320 07:32:41.207671 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98b88bbe-3668-4194-840d-1ba64dd6c32e-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-l6jpr\" (UID: \"98b88bbe-3668-4194-840d-1ba64dd6c32e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-l6jpr" Mar 20 07:32:41 crc 
kubenswrapper[4749]: I0320 07:32:41.213739 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2727\" (UniqueName: \"kubernetes.io/projected/98b88bbe-3668-4194-840d-1ba64dd6c32e-kube-api-access-n2727\") pod \"dnsmasq-dns-74f6bcbc87-l6jpr\" (UID: \"98b88bbe-3668-4194-840d-1ba64dd6c32e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-l6jpr" Mar 20 07:32:41 crc kubenswrapper[4749]: I0320 07:32:41.229155 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-l6jpr" Mar 20 07:32:41 crc kubenswrapper[4749]: I0320 07:32:41.703626 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-l6jpr"] Mar 20 07:32:42 crc kubenswrapper[4749]: I0320 07:32:42.594642 4749 generic.go:334] "Generic (PLEG): container finished" podID="98b88bbe-3668-4194-840d-1ba64dd6c32e" containerID="f07f5095b86b869a439beeb06142e157f94d6013bcbf71a61741b51fec6399ce" exitCode=0 Mar 20 07:32:42 crc kubenswrapper[4749]: I0320 07:32:42.594736 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-l6jpr" event={"ID":"98b88bbe-3668-4194-840d-1ba64dd6c32e","Type":"ContainerDied","Data":"f07f5095b86b869a439beeb06142e157f94d6013bcbf71a61741b51fec6399ce"} Mar 20 07:32:42 crc kubenswrapper[4749]: I0320 07:32:42.595141 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-l6jpr" event={"ID":"98b88bbe-3668-4194-840d-1ba64dd6c32e","Type":"ContainerStarted","Data":"ef489d9ee52482bc80e4d9b3d0442e0ef774298546dffb52385ec14843c0bebd"} Mar 20 07:32:42 crc kubenswrapper[4749]: I0320 07:32:42.595473 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b946c75cc-5vqsx" podUID="856b835b-2c70-4faa-a12a-43d6f59195f4" containerName="dnsmasq-dns" containerID="cri-o://79557d091acc62b85af14507852b77df6d85efc7e3b724cc33d62415ffb87939" gracePeriod=10 Mar 20 07:32:42 crc kubenswrapper[4749]: I0320 07:32:42.598032 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b946c75cc-5vqsx" Mar 20 07:32:43 crc kubenswrapper[4749]: I0320 07:32:43.026963 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-5vqsx" Mar 20 07:32:43 crc kubenswrapper[4749]: I0320 07:32:43.127816 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/856b835b-2c70-4faa-a12a-43d6f59195f4-config\") pod \"856b835b-2c70-4faa-a12a-43d6f59195f4\" (UID: \"856b835b-2c70-4faa-a12a-43d6f59195f4\") " Mar 20 07:32:43 crc kubenswrapper[4749]: I0320 07:32:43.127921 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/856b835b-2c70-4faa-a12a-43d6f59195f4-ovsdbserver-nb\") pod \"856b835b-2c70-4faa-a12a-43d6f59195f4\" (UID: \"856b835b-2c70-4faa-a12a-43d6f59195f4\") " Mar 20 07:32:43 crc kubenswrapper[4749]: I0320 07:32:43.127992 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/856b835b-2c70-4faa-a12a-43d6f59195f4-ovsdbserver-sb\") pod \"856b835b-2c70-4faa-a12a-43d6f59195f4\" (UID: \"856b835b-2c70-4faa-a12a-43d6f59195f4\") " Mar 20 07:32:43 crc kubenswrapper[4749]: I0320 07:32:43.128033 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/856b835b-2c70-4faa-a12a-43d6f59195f4-dns-svc\") pod \"856b835b-2c70-4faa-a12a-43d6f59195f4\" (UID: \"856b835b-2c70-4faa-a12a-43d6f59195f4\") " Mar 20 07:32:43 crc kubenswrapper[4749]: I0320 07:32:43.128086 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9p4r\" (UniqueName: \"kubernetes.io/projected/856b835b-2c70-4faa-a12a-43d6f59195f4-kube-api-access-l9p4r\") pod \"856b835b-2c70-4faa-a12a-43d6f59195f4\" (UID: \"856b835b-2c70-4faa-a12a-43d6f59195f4\") " Mar 20 07:32:43 crc kubenswrapper[4749]: I0320 07:32:43.134001 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/856b835b-2c70-4faa-a12a-43d6f59195f4-kube-api-access-l9p4r" (OuterVolumeSpecName: "kube-api-access-l9p4r") pod "856b835b-2c70-4faa-a12a-43d6f59195f4" (UID: "856b835b-2c70-4faa-a12a-43d6f59195f4"). InnerVolumeSpecName "kube-api-access-l9p4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:32:43 crc kubenswrapper[4749]: I0320 07:32:43.165491 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/856b835b-2c70-4faa-a12a-43d6f59195f4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "856b835b-2c70-4faa-a12a-43d6f59195f4" (UID: "856b835b-2c70-4faa-a12a-43d6f59195f4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:32:43 crc kubenswrapper[4749]: I0320 07:32:43.166336 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/856b835b-2c70-4faa-a12a-43d6f59195f4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "856b835b-2c70-4faa-a12a-43d6f59195f4" (UID: "856b835b-2c70-4faa-a12a-43d6f59195f4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:32:43 crc kubenswrapper[4749]: I0320 07:32:43.171249 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/856b835b-2c70-4faa-a12a-43d6f59195f4-config" (OuterVolumeSpecName: "config") pod "856b835b-2c70-4faa-a12a-43d6f59195f4" (UID: "856b835b-2c70-4faa-a12a-43d6f59195f4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:32:43 crc kubenswrapper[4749]: I0320 07:32:43.180251 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/856b835b-2c70-4faa-a12a-43d6f59195f4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "856b835b-2c70-4faa-a12a-43d6f59195f4" (UID: "856b835b-2c70-4faa-a12a-43d6f59195f4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:32:43 crc kubenswrapper[4749]: I0320 07:32:43.230741 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/856b835b-2c70-4faa-a12a-43d6f59195f4-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:43 crc kubenswrapper[4749]: I0320 07:32:43.230804 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/856b835b-2c70-4faa-a12a-43d6f59195f4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:43 crc kubenswrapper[4749]: I0320 07:32:43.230835 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/856b835b-2c70-4faa-a12a-43d6f59195f4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:43 crc kubenswrapper[4749]: I0320 07:32:43.230861 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/856b835b-2c70-4faa-a12a-43d6f59195f4-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:43 crc kubenswrapper[4749]: I0320 07:32:43.230883 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9p4r\" (UniqueName: \"kubernetes.io/projected/856b835b-2c70-4faa-a12a-43d6f59195f4-kube-api-access-l9p4r\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:43 crc kubenswrapper[4749]: I0320 07:32:43.610613 4749 generic.go:334] "Generic (PLEG): container finished" podID="856b835b-2c70-4faa-a12a-43d6f59195f4" containerID="79557d091acc62b85af14507852b77df6d85efc7e3b724cc33d62415ffb87939" exitCode=0 Mar 20 07:32:43 crc kubenswrapper[4749]: I0320 07:32:43.610738 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-5vqsx" Mar 20 07:32:43 crc kubenswrapper[4749]: I0320 07:32:43.610762 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-5vqsx" event={"ID":"856b835b-2c70-4faa-a12a-43d6f59195f4","Type":"ContainerDied","Data":"79557d091acc62b85af14507852b77df6d85efc7e3b724cc33d62415ffb87939"} Mar 20 07:32:43 crc kubenswrapper[4749]: I0320 07:32:43.611275 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-5vqsx" event={"ID":"856b835b-2c70-4faa-a12a-43d6f59195f4","Type":"ContainerDied","Data":"e7f6693d0258dc9f6952b7ed90f51e4d189640a4e6fa9f24c00d470dfc1e86ef"} Mar 20 07:32:43 crc kubenswrapper[4749]: I0320 07:32:43.611343 4749 scope.go:117] "RemoveContainer" containerID="79557d091acc62b85af14507852b77df6d85efc7e3b724cc33d62415ffb87939" Mar 20 07:32:43 crc kubenswrapper[4749]: I0320 07:32:43.616599 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-l6jpr" event={"ID":"98b88bbe-3668-4194-840d-1ba64dd6c32e","Type":"ContainerStarted","Data":"dab6ff935a1562c566572b400b57826331cef702030478893e22e5497ef3a904"} Mar 20 07:32:43 crc kubenswrapper[4749]: I0320 07:32:43.616766 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-l6jpr" Mar 20 07:32:43 crc kubenswrapper[4749]: I0320 07:32:43.648114 4749 scope.go:117] "RemoveContainer" containerID="5cfc253b44e9c5875410308b13cec425bf973d810d8d634ba7562fe15b924465" Mar 20 07:32:43 crc kubenswrapper[4749]: I0320 07:32:43.656948 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74f6bcbc87-l6jpr" podStartSLOduration=3.6569275660000002 podStartE2EDuration="3.656927566s" podCreationTimestamp="2026-03-20 07:32:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 07:32:43.648376019 +0000 UTC m=+1200.198033706" watchObservedRunningTime="2026-03-20 07:32:43.656927566 +0000 UTC m=+1200.206585213" Mar 20 07:32:43 crc kubenswrapper[4749]: I0320 07:32:43.674980 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-5vqsx"] Mar 20 07:32:43 crc kubenswrapper[4749]: I0320 07:32:43.684839 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-5vqsx"] Mar 20 07:32:43 crc kubenswrapper[4749]: I0320 07:32:43.687994 4749 scope.go:117] "RemoveContainer" containerID="79557d091acc62b85af14507852b77df6d85efc7e3b724cc33d62415ffb87939" Mar 20 07:32:43 crc kubenswrapper[4749]: E0320 07:32:43.688661 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79557d091acc62b85af14507852b77df6d85efc7e3b724cc33d62415ffb87939\": container with ID starting with 79557d091acc62b85af14507852b77df6d85efc7e3b724cc33d62415ffb87939 not found: ID does not exist" containerID="79557d091acc62b85af14507852b77df6d85efc7e3b724cc33d62415ffb87939" Mar 20 07:32:43 crc kubenswrapper[4749]: I0320 07:32:43.688723 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79557d091acc62b85af14507852b77df6d85efc7e3b724cc33d62415ffb87939"} err="failed to get container status \"79557d091acc62b85af14507852b77df6d85efc7e3b724cc33d62415ffb87939\": rpc error: code = NotFound desc = could not find container 
\"79557d091acc62b85af14507852b77df6d85efc7e3b724cc33d62415ffb87939\": container with ID starting with 79557d091acc62b85af14507852b77df6d85efc7e3b724cc33d62415ffb87939 not found: ID does not exist" Mar 20 07:32:43 crc kubenswrapper[4749]: I0320 07:32:43.688747 4749 scope.go:117] "RemoveContainer" containerID="5cfc253b44e9c5875410308b13cec425bf973d810d8d634ba7562fe15b924465" Mar 20 07:32:43 crc kubenswrapper[4749]: E0320 07:32:43.689056 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cfc253b44e9c5875410308b13cec425bf973d810d8d634ba7562fe15b924465\": container with ID starting with 5cfc253b44e9c5875410308b13cec425bf973d810d8d634ba7562fe15b924465 not found: ID does not exist" containerID="5cfc253b44e9c5875410308b13cec425bf973d810d8d634ba7562fe15b924465" Mar 20 07:32:43 crc kubenswrapper[4749]: I0320 07:32:43.689144 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cfc253b44e9c5875410308b13cec425bf973d810d8d634ba7562fe15b924465"} err="failed to get container status \"5cfc253b44e9c5875410308b13cec425bf973d810d8d634ba7562fe15b924465\": rpc error: code = NotFound desc = could not find container \"5cfc253b44e9c5875410308b13cec425bf973d810d8d634ba7562fe15b924465\": container with ID starting with 5cfc253b44e9c5875410308b13cec425bf973d810d8d634ba7562fe15b924465 not found: ID does not exist" Mar 20 07:32:44 crc kubenswrapper[4749]: I0320 07:32:44.193054 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="856b835b-2c70-4faa-a12a-43d6f59195f4" path="/var/lib/kubelet/pods/856b835b-2c70-4faa-a12a-43d6f59195f4/volumes" Mar 20 07:32:51 crc kubenswrapper[4749]: I0320 07:32:51.177851 4749 scope.go:117] "RemoveContainer" containerID="33bc94bce903b7afb8d3950455538835c2def1369c4cf0f1b11ec4712f53f659" Mar 20 07:32:51 crc kubenswrapper[4749]: I0320 07:32:51.231546 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74f6bcbc87-l6jpr" Mar 20 07:32:51 crc kubenswrapper[4749]: I0320 07:32:51.326130 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-f5tdv"] Mar 20 07:32:51 crc kubenswrapper[4749]: I0320 07:32:51.329201 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-f5tdv" podUID="b30e9b27-acb7-4df2-b745-482fe080f360" containerName="dnsmasq-dns" containerID="cri-o://cfcbb53039c573bf388a044b8c8a51cccd81128168646e2bdf5cb23961134a3d" gracePeriod=10 Mar 20 07:32:51 crc kubenswrapper[4749]: I0320 07:32:51.703510 4749 generic.go:334] "Generic (PLEG): container finished" podID="b30e9b27-acb7-4df2-b745-482fe080f360" containerID="cfcbb53039c573bf388a044b8c8a51cccd81128168646e2bdf5cb23961134a3d" exitCode=0 Mar 20 07:32:51 crc kubenswrapper[4749]: I0320 07:32:51.703775 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-f5tdv" event={"ID":"b30e9b27-acb7-4df2-b745-482fe080f360","Type":"ContainerDied","Data":"cfcbb53039c573bf388a044b8c8a51cccd81128168646e2bdf5cb23961134a3d"} Mar 20 07:32:51 crc kubenswrapper[4749]: I0320 07:32:51.707906 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8db06e36-0b00-4157-9345-69449da3e85f","Type":"ContainerStarted","Data":"98d0e15e8899a27cd3cbef468766adedcde815315aef384d07e45d58e9d77b1c"} Mar 20 07:32:51 crc kubenswrapper[4749]: I0320 07:32:51.708584 4749 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 20 07:32:51 crc kubenswrapper[4749]: I0320 07:32:51.827493 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-f5tdv" Mar 20 07:32:51 crc kubenswrapper[4749]: I0320 07:32:51.988861 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b30e9b27-acb7-4df2-b745-482fe080f360-dns-svc\") pod \"b30e9b27-acb7-4df2-b745-482fe080f360\" (UID: \"b30e9b27-acb7-4df2-b745-482fe080f360\") " Mar 20 07:32:51 crc kubenswrapper[4749]: I0320 07:32:51.988904 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b30e9b27-acb7-4df2-b745-482fe080f360-ovsdbserver-nb\") pod \"b30e9b27-acb7-4df2-b745-482fe080f360\" (UID: \"b30e9b27-acb7-4df2-b745-482fe080f360\") " Mar 20 07:32:51 crc kubenswrapper[4749]: I0320 07:32:51.988960 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b30e9b27-acb7-4df2-b745-482fe080f360-config\") pod \"b30e9b27-acb7-4df2-b745-482fe080f360\" (UID: \"b30e9b27-acb7-4df2-b745-482fe080f360\") " Mar 20 07:32:51 crc kubenswrapper[4749]: I0320 07:32:51.989022 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9btfr\" (UniqueName: \"kubernetes.io/projected/b30e9b27-acb7-4df2-b745-482fe080f360-kube-api-access-9btfr\") pod \"b30e9b27-acb7-4df2-b745-482fe080f360\" (UID: \"b30e9b27-acb7-4df2-b745-482fe080f360\") " Mar 20 07:32:51 crc kubenswrapper[4749]: I0320 07:32:51.989047 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b30e9b27-acb7-4df2-b745-482fe080f360-ovsdbserver-sb\") pod \"b30e9b27-acb7-4df2-b745-482fe080f360\" (UID: \"b30e9b27-acb7-4df2-b745-482fe080f360\") " Mar 20 07:32:51 crc kubenswrapper[4749]: I0320 07:32:51.994802 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b30e9b27-acb7-4df2-b745-482fe080f360-kube-api-access-9btfr" (OuterVolumeSpecName: "kube-api-access-9btfr") pod "b30e9b27-acb7-4df2-b745-482fe080f360" (UID: "b30e9b27-acb7-4df2-b745-482fe080f360"). InnerVolumeSpecName "kube-api-access-9btfr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:32:52 crc kubenswrapper[4749]: I0320 07:32:52.027761 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b30e9b27-acb7-4df2-b745-482fe080f360-config" (OuterVolumeSpecName: "config") pod "b30e9b27-acb7-4df2-b745-482fe080f360" (UID: "b30e9b27-acb7-4df2-b745-482fe080f360"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:32:52 crc kubenswrapper[4749]: I0320 07:32:52.029825 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b30e9b27-acb7-4df2-b745-482fe080f360-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b30e9b27-acb7-4df2-b745-482fe080f360" (UID: "b30e9b27-acb7-4df2-b745-482fe080f360"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:32:52 crc kubenswrapper[4749]: I0320 07:32:52.029873 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b30e9b27-acb7-4df2-b745-482fe080f360-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b30e9b27-acb7-4df2-b745-482fe080f360" (UID: "b30e9b27-acb7-4df2-b745-482fe080f360"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:32:52 crc kubenswrapper[4749]: I0320 07:32:52.039105 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b30e9b27-acb7-4df2-b745-482fe080f360-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b30e9b27-acb7-4df2-b745-482fe080f360" (UID: "b30e9b27-acb7-4df2-b745-482fe080f360"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:32:52 crc kubenswrapper[4749]: I0320 07:32:52.090594 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b30e9b27-acb7-4df2-b745-482fe080f360-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:52 crc kubenswrapper[4749]: I0320 07:32:52.090627 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b30e9b27-acb7-4df2-b745-482fe080f360-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:52 crc kubenswrapper[4749]: I0320 07:32:52.090638 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b30e9b27-acb7-4df2-b745-482fe080f360-config\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:52 crc kubenswrapper[4749]: I0320 07:32:52.090648 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9btfr\" (UniqueName: \"kubernetes.io/projected/b30e9b27-acb7-4df2-b745-482fe080f360-kube-api-access-9btfr\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:52 crc kubenswrapper[4749]: I0320 07:32:52.090657 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b30e9b27-acb7-4df2-b745-482fe080f360-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 07:32:52 crc kubenswrapper[4749]: I0320 07:32:52.176995 4749 scope.go:117] "RemoveContainer" containerID="8899097ca64e738bbb8bc0dd1bf1ddd25ca9fa4e615bd16c8f57a09d2709c496" Mar 20 07:32:52 crc kubenswrapper[4749]: E0320 07:32:52.177331 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 20s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:32:52 crc kubenswrapper[4749]: I0320 07:32:52.718344 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-f5tdv" event={"ID":"b30e9b27-acb7-4df2-b745-482fe080f360","Type":"ContainerDied","Data":"1c7761b5b2d50de8a15920d4f1291212010e8a0c8bb15e87825baac84ae8e300"} Mar 20 07:32:52 crc kubenswrapper[4749]: I0320 07:32:52.718763 4749 scope.go:117] "RemoveContainer" containerID="cfcbb53039c573bf388a044b8c8a51cccd81128168646e2bdf5cb23961134a3d" Mar 20 07:32:52 crc kubenswrapper[4749]: I0320 07:32:52.718392 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-f5tdv" Mar 20 07:32:52 crc kubenswrapper[4749]: I0320 07:32:52.746445 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-f5tdv"] Mar 20 07:32:52 crc kubenswrapper[4749]: I0320 07:32:52.746731 4749 scope.go:117] "RemoveContainer" containerID="d9ebc69cd32fea04c5ad0f326e861b49fe2bfdfd0c4f00aff870db5d07fe495b" Mar 20 07:32:52 crc kubenswrapper[4749]: I0320 07:32:52.756034 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-f5tdv"] Mar 20 07:32:54 crc kubenswrapper[4749]: I0320 07:32:54.189195 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b30e9b27-acb7-4df2-b745-482fe080f360" path="/var/lib/kubelet/pods/b30e9b27-acb7-4df2-b745-482fe080f360/volumes" Mar 20 07:32:54 crc kubenswrapper[4749]: I0320 07:32:54.345133 4749 scope.go:117] "RemoveContainer" containerID="fcaf7400616e181e809747406d4a18533122f17f08dbf981631d0288f4bec979" Mar 20 07:32:55 crc kubenswrapper[4749]: I0320 07:32:55.768251 4749 generic.go:334] "Generic (PLEG): container finished" podID="8db06e36-0b00-4157-9345-69449da3e85f" containerID="98d0e15e8899a27cd3cbef468766adedcde815315aef384d07e45d58e9d77b1c" exitCode=0 Mar 20 07:32:55 crc kubenswrapper[4749]: I0320 07:32:55.768434 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8db06e36-0b00-4157-9345-69449da3e85f","Type":"ContainerDied","Data":"98d0e15e8899a27cd3cbef468766adedcde815315aef384d07e45d58e9d77b1c"} Mar 20 07:32:55 crc kubenswrapper[4749]: I0320 07:32:55.769544 4749 scope.go:117] "RemoveContainer" containerID="33bc94bce903b7afb8d3950455538835c2def1369c4cf0f1b11ec4712f53f659" Mar 20 07:32:55 crc kubenswrapper[4749]: I0320 07:32:55.776458 4749 scope.go:117] "RemoveContainer" containerID="98d0e15e8899a27cd3cbef468766adedcde815315aef384d07e45d58e9d77b1c" Mar 20 07:32:55 crc kubenswrapper[4749]: E0320 07:32:55.776913 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 20s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:33:07 crc kubenswrapper[4749]: I0320 07:33:07.178231 4749 scope.go:117] "RemoveContainer" containerID="8899097ca64e738bbb8bc0dd1bf1ddd25ca9fa4e615bd16c8f57a09d2709c496" Mar 20 07:33:07 crc kubenswrapper[4749]: I0320 07:33:07.912826 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8b9b402f-2d95-48f5-98d8-497d90956ba2","Type":"ContainerStarted","Data":"6c36a6ea0baf5628a75b2c44cb4a0f47e1044940f30f6730bab6f5c41ad9f798"} Mar 20 07:33:07 crc kubenswrapper[4749]: I0320 07:33:07.913238 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:33:09 crc kubenswrapper[4749]: I0320 07:33:09.177838 4749 scope.go:117] "RemoveContainer" containerID="98d0e15e8899a27cd3cbef468766adedcde815315aef384d07e45d58e9d77b1c" Mar 20 07:33:09 crc kubenswrapper[4749]: E0320 07:33:09.178594 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 20s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" 
podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:33:11 crc kubenswrapper[4749]: I0320 07:33:11.969525 4749 generic.go:334] "Generic (PLEG): container finished" podID="8b9b402f-2d95-48f5-98d8-497d90956ba2" containerID="6c36a6ea0baf5628a75b2c44cb4a0f47e1044940f30f6730bab6f5c41ad9f798" exitCode=0 Mar 20 07:33:11 crc kubenswrapper[4749]: I0320 07:33:11.969593 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8b9b402f-2d95-48f5-98d8-497d90956ba2","Type":"ContainerDied","Data":"6c36a6ea0baf5628a75b2c44cb4a0f47e1044940f30f6730bab6f5c41ad9f798"} Mar 20 07:33:11 crc kubenswrapper[4749]: I0320 07:33:11.969852 4749 scope.go:117] "RemoveContainer" containerID="8899097ca64e738bbb8bc0dd1bf1ddd25ca9fa4e615bd16c8f57a09d2709c496" Mar 20 07:33:11 crc kubenswrapper[4749]: I0320 07:33:11.970682 4749 scope.go:117] "RemoveContainer" containerID="6c36a6ea0baf5628a75b2c44cb4a0f47e1044940f30f6730bab6f5c41ad9f798" Mar 20 07:33:11 crc kubenswrapper[4749]: E0320 07:33:11.971184 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 40s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:33:22 crc kubenswrapper[4749]: I0320 07:33:22.178370 4749 scope.go:117] "RemoveContainer" containerID="98d0e15e8899a27cd3cbef468766adedcde815315aef384d07e45d58e9d77b1c" Mar 20 07:33:23 crc kubenswrapper[4749]: I0320 07:33:23.091796 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8db06e36-0b00-4157-9345-69449da3e85f","Type":"ContainerStarted","Data":"abfb501d09635783023288c1b0642636c97c9d32a57499b1c998fb18bdcb12af"} Mar 20 07:33:23 crc kubenswrapper[4749]: I0320 07:33:23.092525 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 20 07:33:26 crc kubenswrapper[4749]: I0320 07:33:26.180563 4749 scope.go:117] "RemoveContainer" containerID="6c36a6ea0baf5628a75b2c44cb4a0f47e1044940f30f6730bab6f5c41ad9f798" Mar 20 07:33:26 crc kubenswrapper[4749]: E0320 07:33:26.181755 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 40s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:33:27 crc kubenswrapper[4749]: I0320 07:33:27.137231 4749 generic.go:334] "Generic (PLEG): container finished" podID="8db06e36-0b00-4157-9345-69449da3e85f" containerID="abfb501d09635783023288c1b0642636c97c9d32a57499b1c998fb18bdcb12af" exitCode=0 Mar 20 07:33:27 crc kubenswrapper[4749]: I0320 07:33:27.137320 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8db06e36-0b00-4157-9345-69449da3e85f","Type":"ContainerDied","Data":"abfb501d09635783023288c1b0642636c97c9d32a57499b1c998fb18bdcb12af"} Mar 20 07:33:27 crc kubenswrapper[4749]: I0320 07:33:27.137366 4749 scope.go:117] "RemoveContainer" containerID="98d0e15e8899a27cd3cbef468766adedcde815315aef384d07e45d58e9d77b1c" Mar 20 07:33:27 crc kubenswrapper[4749]: I0320 07:33:27.138400 4749 scope.go:117] "RemoveContainer" containerID="abfb501d09635783023288c1b0642636c97c9d32a57499b1c998fb18bdcb12af" Mar 20 07:33:27 crc 
kubenswrapper[4749]: E0320 07:33:27.138951 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 40s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:33:39 crc kubenswrapper[4749]: I0320 07:33:39.177493 4749 scope.go:117] "RemoveContainer" containerID="abfb501d09635783023288c1b0642636c97c9d32a57499b1c998fb18bdcb12af" Mar 20 07:33:39 crc kubenswrapper[4749]: E0320 07:33:39.178233 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 40s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:33:40 crc kubenswrapper[4749]: I0320 07:33:40.176832 4749 scope.go:117] "RemoveContainer" containerID="6c36a6ea0baf5628a75b2c44cb4a0f47e1044940f30f6730bab6f5c41ad9f798" Mar 20 07:33:40 crc kubenswrapper[4749]: E0320 07:33:40.177114 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 40s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:33:53 crc kubenswrapper[4749]: I0320 07:33:53.177820 4749 scope.go:117] "RemoveContainer" containerID="6c36a6ea0baf5628a75b2c44cb4a0f47e1044940f30f6730bab6f5c41ad9f798" Mar 20 07:33:53 crc kubenswrapper[4749]: I0320 07:33:53.178528 4749 scope.go:117] "RemoveContainer" containerID="abfb501d09635783023288c1b0642636c97c9d32a57499b1c998fb18bdcb12af" Mar 20 07:33:53 crc kubenswrapper[4749]: E0320 07:33:53.178959 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 40s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:33:53 crc kubenswrapper[4749]: I0320 07:33:53.426538 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8b9b402f-2d95-48f5-98d8-497d90956ba2","Type":"ContainerStarted","Data":"30d6e3a4936161683a4ed97833585ac289e397699ec7d23eb7c441c705709689"} Mar 20 07:33:53 crc kubenswrapper[4749]: I0320 07:33:53.427964 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:33:57 crc kubenswrapper[4749]: I0320 07:33:57.473875 4749 generic.go:334] "Generic (PLEG): container finished" podID="8b9b402f-2d95-48f5-98d8-497d90956ba2" containerID="30d6e3a4936161683a4ed97833585ac289e397699ec7d23eb7c441c705709689" exitCode=0 Mar 20 07:33:57 crc kubenswrapper[4749]: I0320 07:33:57.473984 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8b9b402f-2d95-48f5-98d8-497d90956ba2","Type":"ContainerDied","Data":"30d6e3a4936161683a4ed97833585ac289e397699ec7d23eb7c441c705709689"} Mar 20 07:33:57 crc kubenswrapper[4749]: I0320 07:33:57.474406 4749 scope.go:117] "RemoveContainer" containerID="6c36a6ea0baf5628a75b2c44cb4a0f47e1044940f30f6730bab6f5c41ad9f798" Mar 20 07:33:57 crc 
kubenswrapper[4749]: I0320 07:33:57.476023 4749 scope.go:117] "RemoveContainer" containerID="30d6e3a4936161683a4ed97833585ac289e397699ec7d23eb7c441c705709689" Mar 20 07:33:57 crc kubenswrapper[4749]: E0320 07:33:57.476613 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:34:00 crc kubenswrapper[4749]: I0320 07:34:00.157576 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566534-68w5t"] Mar 20 07:34:00 crc kubenswrapper[4749]: E0320 07:34:00.158382 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="856b835b-2c70-4faa-a12a-43d6f59195f4" containerName="init" Mar 20 07:34:00 crc kubenswrapper[4749]: I0320 07:34:00.158401 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="856b835b-2c70-4faa-a12a-43d6f59195f4" containerName="init" Mar 20 07:34:00 crc kubenswrapper[4749]: E0320 07:34:00.158415 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b30e9b27-acb7-4df2-b745-482fe080f360" containerName="dnsmasq-dns" Mar 20 07:34:00 crc kubenswrapper[4749]: I0320 07:34:00.158424 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b30e9b27-acb7-4df2-b745-482fe080f360" containerName="dnsmasq-dns" Mar 20 07:34:00 crc kubenswrapper[4749]: E0320 07:34:00.158456 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="856b835b-2c70-4faa-a12a-43d6f59195f4" containerName="dnsmasq-dns" Mar 20 07:34:00 crc kubenswrapper[4749]: I0320 07:34:00.158465 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="856b835b-2c70-4faa-a12a-43d6f59195f4" containerName="dnsmasq-dns" Mar 20 07:34:00 crc kubenswrapper[4749]: E0320 07:34:00.158482 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b30e9b27-acb7-4df2-b745-482fe080f360" containerName="init" Mar 20 07:34:00 crc kubenswrapper[4749]: I0320 07:34:00.158490 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b30e9b27-acb7-4df2-b745-482fe080f360" containerName="init" Mar 20 07:34:00 crc kubenswrapper[4749]: I0320 07:34:00.158680 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b30e9b27-acb7-4df2-b745-482fe080f360" containerName="dnsmasq-dns" Mar 20 07:34:00 crc kubenswrapper[4749]: I0320 07:34:00.158702 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="856b835b-2c70-4faa-a12a-43d6f59195f4" containerName="dnsmasq-dns" Mar 20 07:34:00 crc kubenswrapper[4749]: I0320 07:34:00.159407 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566534-68w5t" Mar 20 07:34:00 crc kubenswrapper[4749]: I0320 07:34:00.161755 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 07:34:00 crc kubenswrapper[4749]: I0320 07:34:00.162191 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 07:34:00 crc kubenswrapper[4749]: I0320 07:34:00.162510 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6rhdf" Mar 20 07:34:00 crc kubenswrapper[4749]: I0320 07:34:00.167221 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566534-68w5t"] Mar 20 07:34:00 crc kubenswrapper[4749]: I0320 07:34:00.232838 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgtwp\" (UniqueName: \"kubernetes.io/projected/3f631d0f-f106-44f8-93f6-6e16924f5931-kube-api-access-dgtwp\") pod \"auto-csr-approver-29566534-68w5t\" (UID: \"3f631d0f-f106-44f8-93f6-6e16924f5931\") " pod="openshift-infra/auto-csr-approver-29566534-68w5t" Mar 20 07:34:00 crc kubenswrapper[4749]: I0320 07:34:00.338931 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgtwp\" (UniqueName: \"kubernetes.io/projected/3f631d0f-f106-44f8-93f6-6e16924f5931-kube-api-access-dgtwp\") pod \"auto-csr-approver-29566534-68w5t\" (UID: \"3f631d0f-f106-44f8-93f6-6e16924f5931\") " pod="openshift-infra/auto-csr-approver-29566534-68w5t" Mar 20 07:34:00 crc kubenswrapper[4749]: I0320 07:34:00.376102 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgtwp\" (UniqueName: \"kubernetes.io/projected/3f631d0f-f106-44f8-93f6-6e16924f5931-kube-api-access-dgtwp\") pod \"auto-csr-approver-29566534-68w5t\" (UID: \"3f631d0f-f106-44f8-93f6-6e16924f5931\") " pod="openshift-infra/auto-csr-approver-29566534-68w5t" Mar 20 07:34:00 crc kubenswrapper[4749]: I0320 07:34:00.489401 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566534-68w5t" Mar 20 07:34:01 crc kubenswrapper[4749]: I0320 07:34:01.258627 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566534-68w5t"] Mar 20 07:34:01 crc kubenswrapper[4749]: W0320 07:34:01.264863 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f631d0f_f106_44f8_93f6_6e16924f5931.slice/crio-a625483a2fcbd802a2cc12673df38855314a6902e38a67dd3256cb48484f53d5 WatchSource:0}: Error finding container a625483a2fcbd802a2cc12673df38855314a6902e38a67dd3256cb48484f53d5: Status 404 returned error can't find the container with id a625483a2fcbd802a2cc12673df38855314a6902e38a67dd3256cb48484f53d5 Mar 20 07:34:01 crc kubenswrapper[4749]: I0320 07:34:01.520495 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566534-68w5t" event={"ID":"3f631d0f-f106-44f8-93f6-6e16924f5931","Type":"ContainerStarted","Data":"a625483a2fcbd802a2cc12673df38855314a6902e38a67dd3256cb48484f53d5"} Mar 20 07:34:02 crc kubenswrapper[4749]: I0320 07:34:02.531544 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566534-68w5t" event={"ID":"3f631d0f-f106-44f8-93f6-6e16924f5931","Type":"ContainerStarted","Data":"e7aba9a1598ed3930860f678ea88023de80376569b971db9a3061df46e140a21"} Mar 20 07:34:02 crc kubenswrapper[4749]: I0320 07:34:02.550758 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566534-68w5t" podStartSLOduration=1.696242651 podStartE2EDuration="2.550739741s" podCreationTimestamp="2026-03-20 07:34:00 +0000 UTC" firstStartedPulling="2026-03-20 07:34:01.268233242 +0000 UTC m=+1277.817890879" lastFinishedPulling="2026-03-20 07:34:02.122730282 +0000 UTC m=+1278.672387969" observedRunningTime="2026-03-20 07:34:02.545443701 +0000 UTC m=+1279.095101388" watchObservedRunningTime="2026-03-20 07:34:02.550739741 +0000 UTC m=+1279.100397408" Mar 20 07:34:03 crc kubenswrapper[4749]: I0320 07:34:03.540972 4749 generic.go:334] "Generic (PLEG): container finished" podID="3f631d0f-f106-44f8-93f6-6e16924f5931" containerID="e7aba9a1598ed3930860f678ea88023de80376569b971db9a3061df46e140a21" exitCode=0 Mar 20 07:34:03 crc kubenswrapper[4749]: I0320 07:34:03.541017 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566534-68w5t" event={"ID":"3f631d0f-f106-44f8-93f6-6e16924f5931","Type":"ContainerDied","Data":"e7aba9a1598ed3930860f678ea88023de80376569b971db9a3061df46e140a21"} Mar 20 07:34:04 crc kubenswrapper[4749]: I0320 07:34:04.955667 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566534-68w5t"
Mar 20 07:34:05 crc kubenswrapper[4749]: I0320 07:34:05.132602 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgtwp\" (UniqueName: \"kubernetes.io/projected/3f631d0f-f106-44f8-93f6-6e16924f5931-kube-api-access-dgtwp\") pod \"3f631d0f-f106-44f8-93f6-6e16924f5931\" (UID: \"3f631d0f-f106-44f8-93f6-6e16924f5931\") "
Mar 20 07:34:05 crc kubenswrapper[4749]: I0320 07:34:05.140673 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f631d0f-f106-44f8-93f6-6e16924f5931-kube-api-access-dgtwp" (OuterVolumeSpecName: "kube-api-access-dgtwp") pod "3f631d0f-f106-44f8-93f6-6e16924f5931" (UID: "3f631d0f-f106-44f8-93f6-6e16924f5931"). InnerVolumeSpecName "kube-api-access-dgtwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 07:34:05 crc kubenswrapper[4749]: I0320 07:34:05.235687 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgtwp\" (UniqueName: \"kubernetes.io/projected/3f631d0f-f106-44f8-93f6-6e16924f5931-kube-api-access-dgtwp\") on node \"crc\" DevicePath \"\""
Mar 20 07:34:05 crc kubenswrapper[4749]: I0320 07:34:05.561591 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566534-68w5t" event={"ID":"3f631d0f-f106-44f8-93f6-6e16924f5931","Type":"ContainerDied","Data":"a625483a2fcbd802a2cc12673df38855314a6902e38a67dd3256cb48484f53d5"}
Mar 20 07:34:05 crc kubenswrapper[4749]: I0320 07:34:05.561662 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a625483a2fcbd802a2cc12673df38855314a6902e38a67dd3256cb48484f53d5"
Mar 20 07:34:05 crc kubenswrapper[4749]: I0320 07:34:05.561710 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566534-68w5t"
Mar 20 07:34:05 crc kubenswrapper[4749]: I0320 07:34:05.637932 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566528-wd9jp"]
Mar 20 07:34:05 crc kubenswrapper[4749]: I0320 07:34:05.650648 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566528-wd9jp"]
Mar 20 07:34:06 crc kubenswrapper[4749]: I0320 07:34:06.189456 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d1f5c84-c36f-40ba-b778-11bacadeb004" path="/var/lib/kubelet/pods/3d1f5c84-c36f-40ba-b778-11bacadeb004/volumes"
Mar 20 07:34:07 crc kubenswrapper[4749]: I0320 07:34:07.178336 4749 scope.go:117] "RemoveContainer" containerID="abfb501d09635783023288c1b0642636c97c9d32a57499b1c998fb18bdcb12af"
Mar 20 07:34:07 crc kubenswrapper[4749]: I0320 07:34:07.586033 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8db06e36-0b00-4157-9345-69449da3e85f","Type":"ContainerStarted","Data":"9826a944fb8a4726c1c686219f60aef66aabffd2e869b91046a9e8464f6cd305"}
Mar 20 07:34:07 crc kubenswrapper[4749]: I0320 07:34:07.587860 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Mar 20 07:34:09 crc kubenswrapper[4749]: I0320 07:34:09.176854 4749 scope.go:117] "RemoveContainer" containerID="30d6e3a4936161683a4ed97833585ac289e397699ec7d23eb7c441c705709689"
Mar 20 07:34:09 crc kubenswrapper[4749]: E0320 07:34:09.177055 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2"
Mar 20 07:34:11 crc kubenswrapper[4749]: I0320 07:34:11.659122 4749 generic.go:334] "Generic (PLEG): container finished" podID="8db06e36-0b00-4157-9345-69449da3e85f" containerID="9826a944fb8a4726c1c686219f60aef66aabffd2e869b91046a9e8464f6cd305" exitCode=0
Mar 20 07:34:11 crc kubenswrapper[4749]: I0320 07:34:11.659193 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8db06e36-0b00-4157-9345-69449da3e85f","Type":"ContainerDied","Data":"9826a944fb8a4726c1c686219f60aef66aabffd2e869b91046a9e8464f6cd305"}
Mar 20 07:34:11 crc kubenswrapper[4749]: I0320 07:34:11.659639 4749 scope.go:117] "RemoveContainer" containerID="abfb501d09635783023288c1b0642636c97c9d32a57499b1c998fb18bdcb12af"
Mar 20 07:34:11 crc kubenswrapper[4749]: I0320 07:34:11.660052 4749 scope.go:117] "RemoveContainer" containerID="9826a944fb8a4726c1c686219f60aef66aabffd2e869b91046a9e8464f6cd305"
Mar 20 07:34:11 crc kubenswrapper[4749]: E0320 07:34:11.660265 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f"
Mar 20 07:34:20 crc kubenswrapper[4749]: I0320 07:34:20.177671 4749 scope.go:117] "RemoveContainer" containerID="30d6e3a4936161683a4ed97833585ac289e397699ec7d23eb7c441c705709689"
Mar 20 07:34:20 crc kubenswrapper[4749]: E0320 07:34:20.178447 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2"
Mar 20 07:34:25 crc kubenswrapper[4749]: I0320 07:34:25.177938 4749 scope.go:117] "RemoveContainer" containerID="9826a944fb8a4726c1c686219f60aef66aabffd2e869b91046a9e8464f6cd305"
Mar 20 07:34:25 crc kubenswrapper[4749]: E0320 07:34:25.178582 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f"
Mar 20 07:34:33 crc kubenswrapper[4749]: I0320 07:34:33.177939 4749 scope.go:117] "RemoveContainer" containerID="30d6e3a4936161683a4ed97833585ac289e397699ec7d23eb7c441c705709689"
Mar 20 07:34:33 crc kubenswrapper[4749]: E0320 07:34:33.178873 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2"
Mar 20 07:34:34 crc kubenswrapper[4749]: I0320 07:34:34.514785 4749 patch_prober.go:28] interesting pod/machine-config-daemon-fxqfd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 07:34:34 crc kubenswrapper[4749]: I0320 07:34:34.516655 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 07:34:39 crc kubenswrapper[4749]: I0320 07:34:39.180356 4749 scope.go:117] "RemoveContainer" containerID="9826a944fb8a4726c1c686219f60aef66aabffd2e869b91046a9e8464f6cd305"
Mar 20 07:34:39 crc kubenswrapper[4749]: E0320 07:34:39.182143 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f"
Mar 20 07:34:46 crc kubenswrapper[4749]: I0320 07:34:46.177884 4749 scope.go:117] "RemoveContainer" containerID="30d6e3a4936161683a4ed97833585ac289e397699ec7d23eb7c441c705709689"
Mar 20 07:34:46 crc kubenswrapper[4749]: E0320 07:34:46.179142 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2"
Mar 20 07:34:50 crc kubenswrapper[4749]: I0320 07:34:50.177925 4749 scope.go:117] "RemoveContainer" containerID="9826a944fb8a4726c1c686219f60aef66aabffd2e869b91046a9e8464f6cd305"
Mar 20 07:34:50 crc kubenswrapper[4749]: E0320 07:34:50.178790 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f"
Mar 20 07:34:54 crc kubenswrapper[4749]: I0320 07:34:54.489037 4749 scope.go:117] "RemoveContainer" containerID="2170aeb79ac4e0052bd067379b6b4bdabf45574632953de8ad9b56e8e580441d"
Mar 20 07:34:57 crc kubenswrapper[4749]: I0320 07:34:57.179021 4749 scope.go:117] "RemoveContainer" containerID="30d6e3a4936161683a4ed97833585ac289e397699ec7d23eb7c441c705709689"
Mar 20 07:34:57 crc kubenswrapper[4749]: E0320 07:34:57.179467 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2"
Mar 20 07:35:04 crc kubenswrapper[4749]: I0320 07:35:04.191453 4749 scope.go:117] "RemoveContainer" containerID="9826a944fb8a4726c1c686219f60aef66aabffd2e869b91046a9e8464f6cd305"
Mar 20 07:35:04 crc kubenswrapper[4749]: E0320 07:35:04.191935 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f"
Mar 20 07:35:04 crc kubenswrapper[4749]: I0320 07:35:04.514822 4749 patch_prober.go:28] interesting pod/machine-config-daemon-fxqfd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 07:35:04 crc kubenswrapper[4749]: I0320 07:35:04.514889 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 07:35:12 crc kubenswrapper[4749]: I0320 07:35:12.176827 4749 scope.go:117] "RemoveContainer" containerID="30d6e3a4936161683a4ed97833585ac289e397699ec7d23eb7c441c705709689"
Mar 20 07:35:12 crc kubenswrapper[4749]: E0320 07:35:12.177707 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2"
Mar 20 07:35:17 crc kubenswrapper[4749]: I0320 07:35:17.177673 4749 scope.go:117] "RemoveContainer" containerID="9826a944fb8a4726c1c686219f60aef66aabffd2e869b91046a9e8464f6cd305"
Mar 20 07:35:17 crc kubenswrapper[4749]: E0320 07:35:17.178394 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=rabbitmq
pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:35:27 crc kubenswrapper[4749]: I0320 07:35:27.177810 4749 scope.go:117] "RemoveContainer" containerID="30d6e3a4936161683a4ed97833585ac289e397699ec7d23eb7c441c705709689" Mar 20 07:35:28 crc kubenswrapper[4749]: I0320 07:35:28.372033 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8b9b402f-2d95-48f5-98d8-497d90956ba2","Type":"ContainerStarted","Data":"14d60c4857f48c0f7a46df5633942dd48b8898b1846bf13dd59a861469751bf0"} Mar 20 07:35:28 crc kubenswrapper[4749]: I0320 07:35:28.372789 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:35:31 crc kubenswrapper[4749]: I0320 07:35:31.177101 4749 scope.go:117] "RemoveContainer" containerID="9826a944fb8a4726c1c686219f60aef66aabffd2e869b91046a9e8464f6cd305" Mar 20 07:35:31 crc kubenswrapper[4749]: I0320 07:35:31.417243 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8db06e36-0b00-4157-9345-69449da3e85f","Type":"ContainerStarted","Data":"ef7fc647645cb5bd965319e15caea7a582e346e4421913d237600d9c58b5377a"} Mar 20 07:35:31 crc kubenswrapper[4749]: I0320 07:35:31.417946 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 20 07:35:31 crc kubenswrapper[4749]: I0320 07:35:31.424342 4749 generic.go:334] "Generic (PLEG): container finished" podID="8b9b402f-2d95-48f5-98d8-497d90956ba2" containerID="14d60c4857f48c0f7a46df5633942dd48b8898b1846bf13dd59a861469751bf0" exitCode=0 Mar 20 07:35:31 crc kubenswrapper[4749]: I0320 07:35:31.424820 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8b9b402f-2d95-48f5-98d8-497d90956ba2","Type":"ContainerDied","Data":"14d60c4857f48c0f7a46df5633942dd48b8898b1846bf13dd59a861469751bf0"} Mar 20 07:35:31 crc kubenswrapper[4749]: I0320 07:35:31.424899 4749 scope.go:117] "RemoveContainer" containerID="30d6e3a4936161683a4ed97833585ac289e397699ec7d23eb7c441c705709689" Mar 20 07:35:31 crc kubenswrapper[4749]: I0320 07:35:31.425570 4749 scope.go:117] "RemoveContainer" containerID="14d60c4857f48c0f7a46df5633942dd48b8898b1846bf13dd59a861469751bf0" Mar 20 07:35:31 crc kubenswrapper[4749]: E0320 07:35:31.425847 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:35:34 crc kubenswrapper[4749]: I0320 07:35:34.514825 4749 patch_prober.go:28] interesting pod/machine-config-daemon-fxqfd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 07:35:34 crc kubenswrapper[4749]: I0320 07:35:34.515139 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 
07:35:34 crc kubenswrapper[4749]: I0320 07:35:34.515183 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" Mar 20 07:35:34 crc kubenswrapper[4749]: I0320 07:35:34.515953 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"91c7008ee23efd7c3f17163220decd2750e56886d85eb31073250a59b5bc0138"} pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 07:35:34 crc kubenswrapper[4749]: I0320 07:35:34.516024 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerName="machine-config-daemon" containerID="cri-o://91c7008ee23efd7c3f17163220decd2750e56886d85eb31073250a59b5bc0138" gracePeriod=600 Mar 20 07:35:35 crc kubenswrapper[4749]: I0320 07:35:35.470363 4749 generic.go:334] "Generic (PLEG): container finished" podID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerID="91c7008ee23efd7c3f17163220decd2750e56886d85eb31073250a59b5bc0138" exitCode=0 Mar 20 07:35:35 crc kubenswrapper[4749]: I0320 07:35:35.470431 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" event={"ID":"12151228-1cb9-4086-9a62-f4a9583f5f69","Type":"ContainerDied","Data":"91c7008ee23efd7c3f17163220decd2750e56886d85eb31073250a59b5bc0138"} Mar 20 07:35:35 crc kubenswrapper[4749]: I0320 07:35:35.470825 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" event={"ID":"12151228-1cb9-4086-9a62-f4a9583f5f69","Type":"ContainerStarted","Data":"4d8e78efd9340edd6f9a69c0288b8e640be4f341bfa8796261268cc4055c4563"} Mar 20 07:35:35 crc kubenswrapper[4749]: I0320 07:35:35.470852 4749 scope.go:117] "RemoveContainer" containerID="72a24d5f0786b3da9aac01d553c981fdcf13ebc1b2358317a489547c93d570db" Mar 20 07:35:35 crc kubenswrapper[4749]: I0320 07:35:35.474766 4749 generic.go:334] "Generic (PLEG): container finished" podID="8db06e36-0b00-4157-9345-69449da3e85f" containerID="ef7fc647645cb5bd965319e15caea7a582e346e4421913d237600d9c58b5377a" exitCode=0 Mar 20 07:35:35 crc kubenswrapper[4749]: I0320 07:35:35.474816 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8db06e36-0b00-4157-9345-69449da3e85f","Type":"ContainerDied","Data":"ef7fc647645cb5bd965319e15caea7a582e346e4421913d237600d9c58b5377a"} Mar 20 07:35:35 crc kubenswrapper[4749]: I0320 07:35:35.475673 4749 scope.go:117] "RemoveContainer" containerID="ef7fc647645cb5bd965319e15caea7a582e346e4421913d237600d9c58b5377a" Mar 20 07:35:35 crc kubenswrapper[4749]: E0320 07:35:35.475969 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:35:35 crc kubenswrapper[4749]: I0320 07:35:35.504826 4749 scope.go:117] "RemoveContainer" containerID="9826a944fb8a4726c1c686219f60aef66aabffd2e869b91046a9e8464f6cd305" Mar 20 07:35:45 crc kubenswrapper[4749]: I0320 07:35:45.177749 4749 scope.go:117] "RemoveContainer" 
containerID="14d60c4857f48c0f7a46df5633942dd48b8898b1846bf13dd59a861469751bf0" Mar 20 07:35:45 crc kubenswrapper[4749]: E0320 07:35:45.179625 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:35:47 crc kubenswrapper[4749]: I0320 07:35:47.178056 4749 scope.go:117] "RemoveContainer" containerID="ef7fc647645cb5bd965319e15caea7a582e346e4421913d237600d9c58b5377a" Mar 20 07:35:47 crc kubenswrapper[4749]: E0320 07:35:47.178868 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:35:58 crc kubenswrapper[4749]: I0320 07:35:58.177577 4749 scope.go:117] "RemoveContainer" containerID="14d60c4857f48c0f7a46df5633942dd48b8898b1846bf13dd59a861469751bf0" Mar 20 07:35:58 crc kubenswrapper[4749]: E0320 07:35:58.178222 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:36:00 crc kubenswrapper[4749]: I0320 07:36:00.145196 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566536-v9tvd"] Mar 20 07:36:00 crc kubenswrapper[4749]: E0320 07:36:00.154242 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f631d0f-f106-44f8-93f6-6e16924f5931" containerName="oc" Mar 20 07:36:00 crc kubenswrapper[4749]: I0320 07:36:00.154303 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f631d0f-f106-44f8-93f6-6e16924f5931" containerName="oc" Mar 20 07:36:00 crc kubenswrapper[4749]: I0320 07:36:00.154515 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f631d0f-f106-44f8-93f6-6e16924f5931" containerName="oc" Mar 20 07:36:00 crc kubenswrapper[4749]: I0320 07:36:00.155239 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566536-v9tvd"] Mar 20 07:36:00 crc kubenswrapper[4749]: I0320 07:36:00.155381 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566536-v9tvd" Mar 20 07:36:00 crc kubenswrapper[4749]: I0320 07:36:00.189151 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 07:36:00 crc kubenswrapper[4749]: I0320 07:36:00.189649 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6rhdf" Mar 20 07:36:00 crc kubenswrapper[4749]: I0320 07:36:00.190948 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 07:36:00 crc kubenswrapper[4749]: I0320 07:36:00.263757 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dln7t\" (UniqueName: \"kubernetes.io/projected/c0ec3a89-57c0-46eb-b6fe-64cef0b74782-kube-api-access-dln7t\") pod \"auto-csr-approver-29566536-v9tvd\" (UID: \"c0ec3a89-57c0-46eb-b6fe-64cef0b74782\") " pod="openshift-infra/auto-csr-approver-29566536-v9tvd" Mar 20 07:36:00 crc kubenswrapper[4749]: I0320 07:36:00.365189 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dln7t\" (UniqueName: \"kubernetes.io/projected/c0ec3a89-57c0-46eb-b6fe-64cef0b74782-kube-api-access-dln7t\") pod \"auto-csr-approver-29566536-v9tvd\" (UID: \"c0ec3a89-57c0-46eb-b6fe-64cef0b74782\") " pod="openshift-infra/auto-csr-approver-29566536-v9tvd" Mar 20 07:36:00 crc kubenswrapper[4749]: I0320 07:36:00.401142 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dln7t\" (UniqueName: \"kubernetes.io/projected/c0ec3a89-57c0-46eb-b6fe-64cef0b74782-kube-api-access-dln7t\") pod \"auto-csr-approver-29566536-v9tvd\" (UID: \"c0ec3a89-57c0-46eb-b6fe-64cef0b74782\") " pod="openshift-infra/auto-csr-approver-29566536-v9tvd" Mar 20 07:36:00 crc kubenswrapper[4749]: I0320 07:36:00.515477 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566536-v9tvd" Mar 20 07:36:00 crc kubenswrapper[4749]: I0320 07:36:00.945094 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566536-v9tvd"] Mar 20 07:36:00 crc kubenswrapper[4749]: W0320 07:36:00.946117 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0ec3a89_57c0_46eb_b6fe_64cef0b74782.slice/crio-7495a448bff30edcabec088b7fd754e41f1c012d736e5911d545ec1ae808591b WatchSource:0}: Error finding container 7495a448bff30edcabec088b7fd754e41f1c012d736e5911d545ec1ae808591b: Status 404 returned error can't find the container with id 7495a448bff30edcabec088b7fd754e41f1c012d736e5911d545ec1ae808591b Mar 20 07:36:01 crc kubenswrapper[4749]: I0320 07:36:01.765098 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566536-v9tvd" event={"ID":"c0ec3a89-57c0-46eb-b6fe-64cef0b74782","Type":"ContainerStarted","Data":"7495a448bff30edcabec088b7fd754e41f1c012d736e5911d545ec1ae808591b"} Mar 20 07:36:02 crc kubenswrapper[4749]: I0320 07:36:02.177498 4749 scope.go:117] "RemoveContainer" containerID="ef7fc647645cb5bd965319e15caea7a582e346e4421913d237600d9c58b5377a" Mar 20 07:36:02 crc kubenswrapper[4749]: E0320 07:36:02.178051 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:36:02 crc kubenswrapper[4749]: I0320 07:36:02.777676 4749 generic.go:334] "Generic (PLEG): container finished" podID="c0ec3a89-57c0-46eb-b6fe-64cef0b74782" containerID="03274338217e796f9048d6fd52cdf22751bb228c4e948da0fefd07fe76d1a02b" exitCode=0 Mar 20 07:36:02 crc kubenswrapper[4749]: I0320 07:36:02.777738 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566536-v9tvd" event={"ID":"c0ec3a89-57c0-46eb-b6fe-64cef0b74782","Type":"ContainerDied","Data":"03274338217e796f9048d6fd52cdf22751bb228c4e948da0fefd07fe76d1a02b"} Mar 20 07:36:04 crc kubenswrapper[4749]: I0320 07:36:04.069505 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566536-v9tvd" Mar 20 07:36:04 crc kubenswrapper[4749]: I0320 07:36:04.129672 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dln7t\" (UniqueName: \"kubernetes.io/projected/c0ec3a89-57c0-46eb-b6fe-64cef0b74782-kube-api-access-dln7t\") pod \"c0ec3a89-57c0-46eb-b6fe-64cef0b74782\" (UID: \"c0ec3a89-57c0-46eb-b6fe-64cef0b74782\") " Mar 20 07:36:04 crc kubenswrapper[4749]: I0320 07:36:04.135889 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0ec3a89-57c0-46eb-b6fe-64cef0b74782-kube-api-access-dln7t" (OuterVolumeSpecName: "kube-api-access-dln7t") pod "c0ec3a89-57c0-46eb-b6fe-64cef0b74782" (UID: "c0ec3a89-57c0-46eb-b6fe-64cef0b74782"). InnerVolumeSpecName "kube-api-access-dln7t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:36:04 crc kubenswrapper[4749]: I0320 07:36:04.232006 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dln7t\" (UniqueName: \"kubernetes.io/projected/c0ec3a89-57c0-46eb-b6fe-64cef0b74782-kube-api-access-dln7t\") on node \"crc\" DevicePath \"\"" Mar 20 07:36:04 crc kubenswrapper[4749]: I0320 07:36:04.796323 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566536-v9tvd" event={"ID":"c0ec3a89-57c0-46eb-b6fe-64cef0b74782","Type":"ContainerDied","Data":"7495a448bff30edcabec088b7fd754e41f1c012d736e5911d545ec1ae808591b"} Mar 20 07:36:04 crc kubenswrapper[4749]: I0320 07:36:04.796371 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7495a448bff30edcabec088b7fd754e41f1c012d736e5911d545ec1ae808591b" Mar 20 07:36:04 crc kubenswrapper[4749]: I0320 07:36:04.796381 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566536-v9tvd" Mar 20 07:36:05 crc kubenswrapper[4749]: I0320 07:36:05.138961 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566530-j4mbn"] Mar 20 07:36:05 crc kubenswrapper[4749]: I0320 07:36:05.146731 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566530-j4mbn"] Mar 20 07:36:06 crc kubenswrapper[4749]: I0320 07:36:06.201846 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98099ee0-eea2-4964-82cd-f8a255f33811" path="/var/lib/kubelet/pods/98099ee0-eea2-4964-82cd-f8a255f33811/volumes" Mar 20 07:36:11 crc kubenswrapper[4749]: I0320 07:36:11.177749 4749 scope.go:117] "RemoveContainer" containerID="14d60c4857f48c0f7a46df5633942dd48b8898b1846bf13dd59a861469751bf0" Mar 20 07:36:11 crc kubenswrapper[4749]: E0320 07:36:11.178272 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:36:14 crc kubenswrapper[4749]: I0320 07:36:14.182437 4749 scope.go:117] "RemoveContainer" containerID="ef7fc647645cb5bd965319e15caea7a582e346e4421913d237600d9c58b5377a" Mar 20 07:36:14 crc kubenswrapper[4749]: E0320 07:36:14.183408 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:36:22 crc kubenswrapper[4749]: I0320 07:36:22.177601 4749 scope.go:117] "RemoveContainer" containerID="14d60c4857f48c0f7a46df5633942dd48b8898b1846bf13dd59a861469751bf0" Mar 20 07:36:22 crc kubenswrapper[4749]: E0320 07:36:22.178422 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:36:25 crc kubenswrapper[4749]: I0320 07:36:25.177636 4749 scope.go:117] "RemoveContainer" 
containerID="ef7fc647645cb5bd965319e15caea7a582e346e4421913d237600d9c58b5377a" Mar 20 07:36:25 crc kubenswrapper[4749]: E0320 07:36:25.178123 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:36:36 crc kubenswrapper[4749]: I0320 07:36:36.178384 4749 scope.go:117] "RemoveContainer" containerID="14d60c4857f48c0f7a46df5633942dd48b8898b1846bf13dd59a861469751bf0" Mar 20 07:36:36 crc kubenswrapper[4749]: E0320 07:36:36.179330 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:36:40 crc kubenswrapper[4749]: I0320 07:36:40.178625 4749 scope.go:117] "RemoveContainer" containerID="ef7fc647645cb5bd965319e15caea7a582e346e4421913d237600d9c58b5377a" Mar 20 07:36:40 crc kubenswrapper[4749]: E0320 07:36:40.179835 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:36:50 crc kubenswrapper[4749]: I0320 07:36:50.178187 4749 scope.go:117] "RemoveContainer" containerID="14d60c4857f48c0f7a46df5633942dd48b8898b1846bf13dd59a861469751bf0" Mar 20 07:36:50 crc kubenswrapper[4749]: E0320 07:36:50.179277 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:36:50 crc kubenswrapper[4749]: I0320 07:36:50.807711 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xt7dv"] Mar 20 07:36:50 crc kubenswrapper[4749]: E0320 07:36:50.815470 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0ec3a89-57c0-46eb-b6fe-64cef0b74782" containerName="oc" Mar 20 07:36:50 crc kubenswrapper[4749]: I0320 07:36:50.815522 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0ec3a89-57c0-46eb-b6fe-64cef0b74782" containerName="oc" Mar 20 07:36:50 crc kubenswrapper[4749]: I0320 07:36:50.815756 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0ec3a89-57c0-46eb-b6fe-64cef0b74782" containerName="oc" Mar 20 07:36:50 crc kubenswrapper[4749]: I0320 07:36:50.816852 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xt7dv" Mar 20 07:36:50 crc kubenswrapper[4749]: I0320 07:36:50.832903 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xt7dv"] Mar 20 07:36:50 crc kubenswrapper[4749]: I0320 07:36:50.997209 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-788xc\" (UniqueName: \"kubernetes.io/projected/7b79e720-5ef3-4518-9593-f3fbd622aed4-kube-api-access-788xc\") pod \"redhat-operators-xt7dv\" (UID: \"7b79e720-5ef3-4518-9593-f3fbd622aed4\") " pod="openshift-marketplace/redhat-operators-xt7dv" Mar 20 07:36:50 crc kubenswrapper[4749]: I0320 07:36:50.997338 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b79e720-5ef3-4518-9593-f3fbd622aed4-utilities\") pod \"redhat-operators-xt7dv\" (UID: \"7b79e720-5ef3-4518-9593-f3fbd622aed4\") " pod="openshift-marketplace/redhat-operators-xt7dv" Mar 20 07:36:50 crc kubenswrapper[4749]: I0320 07:36:50.997429 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b79e720-5ef3-4518-9593-f3fbd622aed4-catalog-content\") pod \"redhat-operators-xt7dv\" (UID: \"7b79e720-5ef3-4518-9593-f3fbd622aed4\") " pod="openshift-marketplace/redhat-operators-xt7dv" Mar 20 07:36:51 crc kubenswrapper[4749]: I0320 07:36:51.099091 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b79e720-5ef3-4518-9593-f3fbd622aed4-utilities\") pod \"redhat-operators-xt7dv\" (UID: \"7b79e720-5ef3-4518-9593-f3fbd622aed4\") " pod="openshift-marketplace/redhat-operators-xt7dv" Mar 20 07:36:51 crc kubenswrapper[4749]: I0320 07:36:51.099186 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b79e720-5ef3-4518-9593-f3fbd622aed4-catalog-content\") pod \"redhat-operators-xt7dv\" (UID: \"7b79e720-5ef3-4518-9593-f3fbd622aed4\") " pod="openshift-marketplace/redhat-operators-xt7dv" Mar 20 07:36:51 crc kubenswrapper[4749]: I0320 07:36:51.099239 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-788xc\" (UniqueName: \"kubernetes.io/projected/7b79e720-5ef3-4518-9593-f3fbd622aed4-kube-api-access-788xc\") pod \"redhat-operators-xt7dv\" (UID: \"7b79e720-5ef3-4518-9593-f3fbd622aed4\") " pod="openshift-marketplace/redhat-operators-xt7dv" Mar 20 07:36:51 crc kubenswrapper[4749]: I0320 07:36:51.099765 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b79e720-5ef3-4518-9593-f3fbd622aed4-utilities\") pod \"redhat-operators-xt7dv\" (UID: \"7b79e720-5ef3-4518-9593-f3fbd622aed4\") " pod="openshift-marketplace/redhat-operators-xt7dv" Mar 20 07:36:51 crc kubenswrapper[4749]: I0320 07:36:51.099825 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b79e720-5ef3-4518-9593-f3fbd622aed4-catalog-content\") pod \"redhat-operators-xt7dv\" (UID: \"7b79e720-5ef3-4518-9593-f3fbd622aed4\") " pod="openshift-marketplace/redhat-operators-xt7dv" Mar 20 07:36:51 crc kubenswrapper[4749]: I0320 07:36:51.117440 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-788xc\" (UniqueName: \"kubernetes.io/projected/7b79e720-5ef3-4518-9593-f3fbd622aed4-kube-api-access-788xc\") pod \"redhat-operators-xt7dv\" (UID: \"7b79e720-5ef3-4518-9593-f3fbd622aed4\") " pod="openshift-marketplace/redhat-operators-xt7dv" Mar 20 07:36:51 crc kubenswrapper[4749]: I0320 07:36:51.142959 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xt7dv" Mar 20 07:36:51 crc kubenswrapper[4749]: I0320 07:36:51.593662 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xt7dv"] Mar 20 07:36:52 crc kubenswrapper[4749]: I0320 07:36:52.245608 4749 generic.go:334] "Generic (PLEG): container finished" podID="7b79e720-5ef3-4518-9593-f3fbd622aed4" containerID="778fc80a5f856513572e7eaf5027641a964eeade5cdace34886d6a0771d4abb1" exitCode=0 Mar 20 07:36:52 crc kubenswrapper[4749]: I0320 07:36:52.245686 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xt7dv" event={"ID":"7b79e720-5ef3-4518-9593-f3fbd622aed4","Type":"ContainerDied","Data":"778fc80a5f856513572e7eaf5027641a964eeade5cdace34886d6a0771d4abb1"} Mar 20 07:36:52 crc kubenswrapper[4749]: I0320 07:36:52.245934 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xt7dv" event={"ID":"7b79e720-5ef3-4518-9593-f3fbd622aed4","Type":"ContainerStarted","Data":"174b41e2a82707a0968da3586cfea2ef69219bf97d1dd1a8e2b11266258094e7"} Mar 20 07:36:53 crc kubenswrapper[4749]: I0320 07:36:53.178234 4749 scope.go:117] "RemoveContainer" containerID="ef7fc647645cb5bd965319e15caea7a582e346e4421913d237600d9c58b5377a" Mar 20 07:36:53 crc kubenswrapper[4749]: E0320 07:36:53.178765 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:36:53 crc kubenswrapper[4749]: I0320 07:36:53.255262 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xt7dv" event={"ID":"7b79e720-5ef3-4518-9593-f3fbd622aed4","Type":"ContainerStarted","Data":"2aa907946815ab264e528b9082bb19f895a169a98c48a18a679ef04e03ec2570"} Mar 20 07:36:54 crc kubenswrapper[4749]: I0320 07:36:54.627129 4749 scope.go:117] "RemoveContainer" containerID="b269f05024d71e0720b806b8519a352e516cecac758dd3274ceb2a90f86cd520" Mar 20 07:36:55 crc kubenswrapper[4749]: I0320 07:36:55.279313 4749 generic.go:334] "Generic (PLEG): container finished" podID="7b79e720-5ef3-4518-9593-f3fbd622aed4" containerID="2aa907946815ab264e528b9082bb19f895a169a98c48a18a679ef04e03ec2570" exitCode=0 Mar 20 07:36:55 crc kubenswrapper[4749]: I0320 07:36:55.279366 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xt7dv" event={"ID":"7b79e720-5ef3-4518-9593-f3fbd622aed4","Type":"ContainerDied","Data":"2aa907946815ab264e528b9082bb19f895a169a98c48a18a679ef04e03ec2570"} Mar 20 07:36:56 crc kubenswrapper[4749]: I0320 07:36:56.293751 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xt7dv" event={"ID":"7b79e720-5ef3-4518-9593-f3fbd622aed4","Type":"ContainerStarted","Data":"6f2acbeb2c30cb2e00c9529588bf93b4020dfa7a4c75a25da79aa79eaae0308d"} Mar 20 07:36:56 crc kubenswrapper[4749]: I0320 07:36:56.321548 
4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xt7dv" podStartSLOduration=2.7301004730000002 podStartE2EDuration="6.321527948s" podCreationTimestamp="2026-03-20 07:36:50 +0000 UTC" firstStartedPulling="2026-03-20 07:36:52.247465462 +0000 UTC m=+1448.797123119" lastFinishedPulling="2026-03-20 07:36:55.838892917 +0000 UTC m=+1452.388550594" observedRunningTime="2026-03-20 07:36:56.318668358 +0000 UTC m=+1452.868326015" watchObservedRunningTime="2026-03-20 07:36:56.321527948 +0000 UTC m=+1452.871185605" Mar 20 07:37:01 crc kubenswrapper[4749]: I0320 07:37:01.144323 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xt7dv" Mar 20 07:37:01 crc kubenswrapper[4749]: I0320 07:37:01.144906 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xt7dv" Mar 20 07:37:02 crc kubenswrapper[4749]: I0320 07:37:02.178335 4749 scope.go:117] "RemoveContainer" containerID="14d60c4857f48c0f7a46df5633942dd48b8898b1846bf13dd59a861469751bf0" Mar 20 07:37:02 crc kubenswrapper[4749]: E0320 07:37:02.178740 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:37:02 crc kubenswrapper[4749]: I0320 07:37:02.198777 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xt7dv" podUID="7b79e720-5ef3-4518-9593-f3fbd622aed4" containerName="registry-server" probeResult="failure" output=< Mar 20 07:37:02 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Mar 20 07:37:02 crc kubenswrapper[4749]: > Mar 20 07:37:05 crc kubenswrapper[4749]: I0320 07:37:05.177647 4749 scope.go:117] "RemoveContainer" containerID="ef7fc647645cb5bd965319e15caea7a582e346e4421913d237600d9c58b5377a" Mar 20 07:37:05 crc kubenswrapper[4749]: E0320 07:37:05.178437 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:37:11 crc kubenswrapper[4749]: I0320 07:37:11.225959 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xt7dv" Mar 20 07:37:11 crc kubenswrapper[4749]: I0320 07:37:11.279606 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xt7dv" Mar 20 07:37:11 crc kubenswrapper[4749]: I0320 07:37:11.474174 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xt7dv"] Mar 20 07:37:12 crc kubenswrapper[4749]: I0320 07:37:12.441190 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xt7dv" podUID="7b79e720-5ef3-4518-9593-f3fbd622aed4" containerName="registry-server" containerID="cri-o://6f2acbeb2c30cb2e00c9529588bf93b4020dfa7a4c75a25da79aa79eaae0308d" gracePeriod=2 Mar 20 07:37:12 crc kubenswrapper[4749]: I0320 07:37:12.886537 4749 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xt7dv" Mar 20 07:37:12 crc kubenswrapper[4749]: I0320 07:37:12.974874 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b79e720-5ef3-4518-9593-f3fbd622aed4-catalog-content\") pod \"7b79e720-5ef3-4518-9593-f3fbd622aed4\" (UID: \"7b79e720-5ef3-4518-9593-f3fbd622aed4\") " Mar 20 07:37:12 crc kubenswrapper[4749]: I0320 07:37:12.974945 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b79e720-5ef3-4518-9593-f3fbd622aed4-utilities\") pod \"7b79e720-5ef3-4518-9593-f3fbd622aed4\" (UID: \"7b79e720-5ef3-4518-9593-f3fbd622aed4\") " Mar 20 07:37:12 crc kubenswrapper[4749]: I0320 07:37:12.975044 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-788xc\" (UniqueName: \"kubernetes.io/projected/7b79e720-5ef3-4518-9593-f3fbd622aed4-kube-api-access-788xc\") pod \"7b79e720-5ef3-4518-9593-f3fbd622aed4\" (UID: \"7b79e720-5ef3-4518-9593-f3fbd622aed4\") " Mar 20 07:37:12 crc kubenswrapper[4749]: I0320 07:37:12.976158 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b79e720-5ef3-4518-9593-f3fbd622aed4-utilities" (OuterVolumeSpecName: "utilities") pod "7b79e720-5ef3-4518-9593-f3fbd622aed4" (UID: "7b79e720-5ef3-4518-9593-f3fbd622aed4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:37:12 crc kubenswrapper[4749]: I0320 07:37:12.984964 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b79e720-5ef3-4518-9593-f3fbd622aed4-kube-api-access-788xc" (OuterVolumeSpecName: "kube-api-access-788xc") pod "7b79e720-5ef3-4518-9593-f3fbd622aed4" (UID: "7b79e720-5ef3-4518-9593-f3fbd622aed4"). InnerVolumeSpecName "kube-api-access-788xc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:37:13 crc kubenswrapper[4749]: I0320 07:37:13.076874 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-788xc\" (UniqueName: \"kubernetes.io/projected/7b79e720-5ef3-4518-9593-f3fbd622aed4-kube-api-access-788xc\") on node \"crc\" DevicePath \"\"" Mar 20 07:37:13 crc kubenswrapper[4749]: I0320 07:37:13.076908 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b79e720-5ef3-4518-9593-f3fbd622aed4-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 07:37:13 crc kubenswrapper[4749]: I0320 07:37:13.138551 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b79e720-5ef3-4518-9593-f3fbd622aed4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7b79e720-5ef3-4518-9593-f3fbd622aed4" (UID: "7b79e720-5ef3-4518-9593-f3fbd622aed4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:37:13 crc kubenswrapper[4749]: I0320 07:37:13.178073 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b79e720-5ef3-4518-9593-f3fbd622aed4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 07:37:13 crc kubenswrapper[4749]: I0320 07:37:13.451321 4749 generic.go:334] "Generic (PLEG): container finished" podID="7b79e720-5ef3-4518-9593-f3fbd622aed4" containerID="6f2acbeb2c30cb2e00c9529588bf93b4020dfa7a4c75a25da79aa79eaae0308d" exitCode=0 Mar 20 07:37:13 crc kubenswrapper[4749]: I0320 07:37:13.451375 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xt7dv" event={"ID":"7b79e720-5ef3-4518-9593-f3fbd622aed4","Type":"ContainerDied","Data":"6f2acbeb2c30cb2e00c9529588bf93b4020dfa7a4c75a25da79aa79eaae0308d"} Mar 20 07:37:13 crc kubenswrapper[4749]: I0320 07:37:13.451429 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xt7dv" event={"ID":"7b79e720-5ef3-4518-9593-f3fbd622aed4","Type":"ContainerDied","Data":"174b41e2a82707a0968da3586cfea2ef69219bf97d1dd1a8e2b11266258094e7"} Mar 20 07:37:13 crc kubenswrapper[4749]: I0320 07:37:13.451449 4749 scope.go:117] "RemoveContainer" containerID="6f2acbeb2c30cb2e00c9529588bf93b4020dfa7a4c75a25da79aa79eaae0308d" Mar 20 07:37:13 crc kubenswrapper[4749]: I0320 07:37:13.451463 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xt7dv" Mar 20 07:37:13 crc kubenswrapper[4749]: I0320 07:37:13.469372 4749 scope.go:117] "RemoveContainer" containerID="2aa907946815ab264e528b9082bb19f895a169a98c48a18a679ef04e03ec2570" Mar 20 07:37:13 crc kubenswrapper[4749]: I0320 07:37:13.497104 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xt7dv"] Mar 20 07:37:13 crc kubenswrapper[4749]: I0320 07:37:13.507367 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xt7dv"] Mar 20 07:37:13 crc kubenswrapper[4749]: I0320 07:37:13.508129 4749 scope.go:117] "RemoveContainer" containerID="778fc80a5f856513572e7eaf5027641a964eeade5cdace34886d6a0771d4abb1" Mar 20 07:37:13 crc kubenswrapper[4749]: I0320 07:37:13.526944 4749 scope.go:117] "RemoveContainer" containerID="6f2acbeb2c30cb2e00c9529588bf93b4020dfa7a4c75a25da79aa79eaae0308d" Mar 20 07:37:13 crc kubenswrapper[4749]: E0320 07:37:13.527431 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f2acbeb2c30cb2e00c9529588bf93b4020dfa7a4c75a25da79aa79eaae0308d\": container with ID starting with 6f2acbeb2c30cb2e00c9529588bf93b4020dfa7a4c75a25da79aa79eaae0308d not found: ID does not exist" containerID="6f2acbeb2c30cb2e00c9529588bf93b4020dfa7a4c75a25da79aa79eaae0308d" Mar 20 07:37:13 crc kubenswrapper[4749]: I0320 07:37:13.527492 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f2acbeb2c30cb2e00c9529588bf93b4020dfa7a4c75a25da79aa79eaae0308d"} err="failed to get container status \"6f2acbeb2c30cb2e00c9529588bf93b4020dfa7a4c75a25da79aa79eaae0308d\": rpc error: code = NotFound desc = could not find container \"6f2acbeb2c30cb2e00c9529588bf93b4020dfa7a4c75a25da79aa79eaae0308d\": container with ID starting with 6f2acbeb2c30cb2e00c9529588bf93b4020dfa7a4c75a25da79aa79eaae0308d not found: ID does not exist" Mar 20 07:37:13 crc 
kubenswrapper[4749]: I0320 07:37:13.527517 4749 scope.go:117] "RemoveContainer" containerID="2aa907946815ab264e528b9082bb19f895a169a98c48a18a679ef04e03ec2570" Mar 20 07:37:13 crc kubenswrapper[4749]: E0320 07:37:13.527814 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2aa907946815ab264e528b9082bb19f895a169a98c48a18a679ef04e03ec2570\": container with ID starting with 2aa907946815ab264e528b9082bb19f895a169a98c48a18a679ef04e03ec2570 not found: ID does not exist" containerID="2aa907946815ab264e528b9082bb19f895a169a98c48a18a679ef04e03ec2570" Mar 20 07:37:13 crc kubenswrapper[4749]: I0320 07:37:13.527852 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2aa907946815ab264e528b9082bb19f895a169a98c48a18a679ef04e03ec2570"} err="failed to get container status \"2aa907946815ab264e528b9082bb19f895a169a98c48a18a679ef04e03ec2570\": rpc error: code = NotFound desc = could not find container \"2aa907946815ab264e528b9082bb19f895a169a98c48a18a679ef04e03ec2570\": container with ID starting with 2aa907946815ab264e528b9082bb19f895a169a98c48a18a679ef04e03ec2570 not found: ID does not exist" Mar 20 07:37:13 crc kubenswrapper[4749]: I0320 07:37:13.527889 4749 scope.go:117] "RemoveContainer" containerID="778fc80a5f856513572e7eaf5027641a964eeade5cdace34886d6a0771d4abb1" Mar 20 07:37:13 crc kubenswrapper[4749]: E0320 07:37:13.528380 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"778fc80a5f856513572e7eaf5027641a964eeade5cdace34886d6a0771d4abb1\": container with ID starting with 778fc80a5f856513572e7eaf5027641a964eeade5cdace34886d6a0771d4abb1 not found: ID does not exist" containerID="778fc80a5f856513572e7eaf5027641a964eeade5cdace34886d6a0771d4abb1" Mar 20 07:37:13 crc kubenswrapper[4749]: I0320 07:37:13.528445 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"778fc80a5f856513572e7eaf5027641a964eeade5cdace34886d6a0771d4abb1"} err="failed to get container status \"778fc80a5f856513572e7eaf5027641a964eeade5cdace34886d6a0771d4abb1\": rpc error: code = NotFound desc = could not find container \"778fc80a5f856513572e7eaf5027641a964eeade5cdace34886d6a0771d4abb1\": container with ID starting with 778fc80a5f856513572e7eaf5027641a964eeade5cdace34886d6a0771d4abb1 not found: ID does not exist" Mar 20 07:37:14 crc kubenswrapper[4749]: I0320 07:37:14.187484 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b79e720-5ef3-4518-9593-f3fbd622aed4" path="/var/lib/kubelet/pods/7b79e720-5ef3-4518-9593-f3fbd622aed4/volumes" Mar 20 07:37:16 crc kubenswrapper[4749]: I0320 07:37:16.177480 4749 scope.go:117] "RemoveContainer" containerID="ef7fc647645cb5bd965319e15caea7a582e346e4421913d237600d9c58b5377a" Mar 20 07:37:16 crc kubenswrapper[4749]: E0320 07:37:16.178050 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:37:17 crc kubenswrapper[4749]: I0320 07:37:17.177554 4749 scope.go:117] "RemoveContainer" containerID="14d60c4857f48c0f7a46df5633942dd48b8898b1846bf13dd59a861469751bf0" Mar 20 07:37:17 crc kubenswrapper[4749]: E0320 07:37:17.177904 4749 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:37:29 crc kubenswrapper[4749]: I0320 07:37:29.178260 4749 scope.go:117] "RemoveContainer" containerID="14d60c4857f48c0f7a46df5633942dd48b8898b1846bf13dd59a861469751bf0" Mar 20 07:37:29 crc kubenswrapper[4749]: E0320 07:37:29.179351 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:37:30 crc kubenswrapper[4749]: I0320 07:37:30.178543 4749 scope.go:117] "RemoveContainer" containerID="ef7fc647645cb5bd965319e15caea7a582e346e4421913d237600d9c58b5377a" Mar 20 07:37:30 crc kubenswrapper[4749]: E0320 07:37:30.179397 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:37:34 crc kubenswrapper[4749]: I0320 07:37:34.514979 4749 patch_prober.go:28] interesting pod/machine-config-daemon-fxqfd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 07:37:34 crc kubenswrapper[4749]: I0320 07:37:34.515551 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:37:42 crc kubenswrapper[4749]: I0320 07:37:42.178049 4749 scope.go:117] "RemoveContainer" containerID="14d60c4857f48c0f7a46df5633942dd48b8898b1846bf13dd59a861469751bf0" Mar 20 07:37:42 crc kubenswrapper[4749]: E0320 07:37:42.179123 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:37:43 crc kubenswrapper[4749]: I0320 07:37:43.177661 4749 scope.go:117] "RemoveContainer" containerID="ef7fc647645cb5bd965319e15caea7a582e346e4421913d237600d9c58b5377a" Mar 20 07:37:43 crc kubenswrapper[4749]: E0320 07:37:43.178276 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:37:55 crc kubenswrapper[4749]: I0320 07:37:55.177271 4749 scope.go:117] "RemoveContainer" 
containerID="ef7fc647645cb5bd965319e15caea7a582e346e4421913d237600d9c58b5377a" Mar 20 07:37:55 crc kubenswrapper[4749]: E0320 07:37:55.178546 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:37:56 crc kubenswrapper[4749]: I0320 07:37:56.177573 4749 scope.go:117] "RemoveContainer" containerID="14d60c4857f48c0f7a46df5633942dd48b8898b1846bf13dd59a861469751bf0" Mar 20 07:37:56 crc kubenswrapper[4749]: E0320 07:37:56.178037 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:38:00 crc kubenswrapper[4749]: I0320 07:38:00.167148 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566538-g7qxx"] Mar 20 07:38:00 crc kubenswrapper[4749]: E0320 07:38:00.168364 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b79e720-5ef3-4518-9593-f3fbd622aed4" containerName="registry-server" Mar 20 07:38:00 crc kubenswrapper[4749]: I0320 07:38:00.168386 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b79e720-5ef3-4518-9593-f3fbd622aed4" containerName="registry-server" Mar 20 07:38:00 crc kubenswrapper[4749]: E0320 07:38:00.168411 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b79e720-5ef3-4518-9593-f3fbd622aed4" containerName="extract-utilities" Mar 20 07:38:00 crc kubenswrapper[4749]: I0320 07:38:00.168424 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b79e720-5ef3-4518-9593-f3fbd622aed4" containerName="extract-utilities" Mar 20 07:38:00 crc kubenswrapper[4749]: E0320 07:38:00.168458 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b79e720-5ef3-4518-9593-f3fbd622aed4" containerName="extract-content" Mar 20 07:38:00 crc kubenswrapper[4749]: I0320 07:38:00.168472 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b79e720-5ef3-4518-9593-f3fbd622aed4" containerName="extract-content" Mar 20 07:38:00 crc kubenswrapper[4749]: I0320 07:38:00.168806 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b79e720-5ef3-4518-9593-f3fbd622aed4" containerName="registry-server" Mar 20 07:38:00 crc kubenswrapper[4749]: I0320 07:38:00.169706 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566538-g7qxx" Mar 20 07:38:00 crc kubenswrapper[4749]: I0320 07:38:00.175368 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6rhdf" Mar 20 07:38:00 crc kubenswrapper[4749]: I0320 07:38:00.175405 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 07:38:00 crc kubenswrapper[4749]: I0320 07:38:00.175663 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 07:38:00 crc kubenswrapper[4749]: I0320 07:38:00.209535 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566538-g7qxx"] Mar 20 07:38:00 crc kubenswrapper[4749]: I0320 07:38:00.300046 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bzlr\" (UniqueName: \"kubernetes.io/projected/d8804ec1-0bcd-42e0-bd6c-63af36a81efb-kube-api-access-2bzlr\") pod \"auto-csr-approver-29566538-g7qxx\" (UID: \"d8804ec1-0bcd-42e0-bd6c-63af36a81efb\") " pod="openshift-infra/auto-csr-approver-29566538-g7qxx" Mar 20 07:38:00 crc kubenswrapper[4749]: I0320 07:38:00.402792 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bzlr\" (UniqueName: \"kubernetes.io/projected/d8804ec1-0bcd-42e0-bd6c-63af36a81efb-kube-api-access-2bzlr\") pod \"auto-csr-approver-29566538-g7qxx\" (UID: \"d8804ec1-0bcd-42e0-bd6c-63af36a81efb\") " pod="openshift-infra/auto-csr-approver-29566538-g7qxx" Mar 20 07:38:00 crc kubenswrapper[4749]: I0320 07:38:00.435353 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bzlr\" (UniqueName: \"kubernetes.io/projected/d8804ec1-0bcd-42e0-bd6c-63af36a81efb-kube-api-access-2bzlr\") pod \"auto-csr-approver-29566538-g7qxx\" (UID: \"d8804ec1-0bcd-42e0-bd6c-63af36a81efb\") " pod="openshift-infra/auto-csr-approver-29566538-g7qxx" Mar 20 07:38:00 crc kubenswrapper[4749]: I0320 07:38:00.503050 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566538-g7qxx" Mar 20 07:38:00 crc kubenswrapper[4749]: I0320 07:38:00.948475 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566538-g7qxx"] Mar 20 07:38:00 crc kubenswrapper[4749]: W0320 07:38:00.960719 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8804ec1_0bcd_42e0_bd6c_63af36a81efb.slice/crio-2aaf262506f0da541f5d8551e89c60df14b0a8a1e2b6e1b9c48965fab03f3f29 WatchSource:0}: Error finding container 2aaf262506f0da541f5d8551e89c60df14b0a8a1e2b6e1b9c48965fab03f3f29: Status 404 returned error can't find the container with id 2aaf262506f0da541f5d8551e89c60df14b0a8a1e2b6e1b9c48965fab03f3f29 Mar 20 07:38:00 crc kubenswrapper[4749]: I0320 07:38:00.963484 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 07:38:01 crc kubenswrapper[4749]: I0320 07:38:01.896478 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566538-g7qxx" event={"ID":"d8804ec1-0bcd-42e0-bd6c-63af36a81efb","Type":"ContainerStarted","Data":"2aaf262506f0da541f5d8551e89c60df14b0a8a1e2b6e1b9c48965fab03f3f29"} Mar 20 07:38:02 crc kubenswrapper[4749]: I0320 07:38:02.909990 4749 generic.go:334] "Generic (PLEG): container finished" podID="d8804ec1-0bcd-42e0-bd6c-63af36a81efb" containerID="d68b5348b6e53fa83b8c33c432c700631f999ced8e97b2c3e1f926f08ed5a9f7" exitCode=0 Mar 20 07:38:02 crc kubenswrapper[4749]: I0320 07:38:02.910080 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566538-g7qxx" event={"ID":"d8804ec1-0bcd-42e0-bd6c-63af36a81efb","Type":"ContainerDied","Data":"d68b5348b6e53fa83b8c33c432c700631f999ced8e97b2c3e1f926f08ed5a9f7"} Mar 20 07:38:04 crc kubenswrapper[4749]: I0320 07:38:04.256695 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566538-g7qxx" Mar 20 07:38:04 crc kubenswrapper[4749]: I0320 07:38:04.374168 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bzlr\" (UniqueName: \"kubernetes.io/projected/d8804ec1-0bcd-42e0-bd6c-63af36a81efb-kube-api-access-2bzlr\") pod \"d8804ec1-0bcd-42e0-bd6c-63af36a81efb\" (UID: \"d8804ec1-0bcd-42e0-bd6c-63af36a81efb\") " Mar 20 07:38:04 crc kubenswrapper[4749]: I0320 07:38:04.380235 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8804ec1-0bcd-42e0-bd6c-63af36a81efb-kube-api-access-2bzlr" (OuterVolumeSpecName: "kube-api-access-2bzlr") pod "d8804ec1-0bcd-42e0-bd6c-63af36a81efb" (UID: "d8804ec1-0bcd-42e0-bd6c-63af36a81efb"). InnerVolumeSpecName "kube-api-access-2bzlr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:38:04 crc kubenswrapper[4749]: I0320 07:38:04.476195 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bzlr\" (UniqueName: \"kubernetes.io/projected/d8804ec1-0bcd-42e0-bd6c-63af36a81efb-kube-api-access-2bzlr\") on node \"crc\" DevicePath \"\"" Mar 20 07:38:04 crc kubenswrapper[4749]: I0320 07:38:04.514725 4749 patch_prober.go:28] interesting pod/machine-config-daemon-fxqfd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 07:38:04 crc kubenswrapper[4749]: I0320 07:38:04.514778 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:38:04 crc kubenswrapper[4749]: I0320 07:38:04.928958 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566538-g7qxx" event={"ID":"d8804ec1-0bcd-42e0-bd6c-63af36a81efb","Type":"ContainerDied","Data":"2aaf262506f0da541f5d8551e89c60df14b0a8a1e2b6e1b9c48965fab03f3f29"} Mar 20 07:38:04 crc kubenswrapper[4749]: I0320 07:38:04.929018 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2aaf262506f0da541f5d8551e89c60df14b0a8a1e2b6e1b9c48965fab03f3f29" Mar 20 07:38:04 crc kubenswrapper[4749]: I0320 07:38:04.929043 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566538-g7qxx" Mar 20 07:38:05 crc kubenswrapper[4749]: I0320 07:38:05.330167 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566532-vqmck"] Mar 20 07:38:05 crc kubenswrapper[4749]: I0320 07:38:05.341570 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566532-vqmck"] Mar 20 07:38:06 crc kubenswrapper[4749]: I0320 07:38:06.178044 4749 scope.go:117] "RemoveContainer" containerID="ef7fc647645cb5bd965319e15caea7a582e346e4421913d237600d9c58b5377a" Mar 20 07:38:06 crc kubenswrapper[4749]: E0320 07:38:06.178443 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:38:06 crc kubenswrapper[4749]: I0320 07:38:06.190220 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d08b93fb-b8b6-4ec4-a812-fa127e9519ae" path="/var/lib/kubelet/pods/d08b93fb-b8b6-4ec4-a812-fa127e9519ae/volumes" Mar 20 07:38:09 crc kubenswrapper[4749]: I0320 07:38:09.178226 4749 scope.go:117] "RemoveContainer" containerID="14d60c4857f48c0f7a46df5633942dd48b8898b1846bf13dd59a861469751bf0" Mar 20 07:38:09 crc kubenswrapper[4749]: E0320 07:38:09.178961 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" 
podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:38:19 crc kubenswrapper[4749]: I0320 07:38:19.177610 4749 scope.go:117] "RemoveContainer" containerID="ef7fc647645cb5bd965319e15caea7a582e346e4421913d237600d9c58b5377a" Mar 20 07:38:20 crc kubenswrapper[4749]: I0320 07:38:20.077367 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8db06e36-0b00-4157-9345-69449da3e85f","Type":"ContainerStarted","Data":"17c6354fdb5bb7b31802e55acc92a8922ba9fa7a3272f8ce11243a6cbe9be4fa"} Mar 20 07:38:20 crc kubenswrapper[4749]: I0320 07:38:20.077887 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 20 07:38:21 crc kubenswrapper[4749]: I0320 07:38:21.177849 4749 scope.go:117] "RemoveContainer" containerID="14d60c4857f48c0f7a46df5633942dd48b8898b1846bf13dd59a861469751bf0" Mar 20 07:38:22 crc kubenswrapper[4749]: I0320 07:38:22.103361 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8b9b402f-2d95-48f5-98d8-497d90956ba2","Type":"ContainerStarted","Data":"1cd50d58afca40cad64e7875955d25b8686cb618117f93d78af3df8989a4b1c2"} Mar 20 07:38:22 crc kubenswrapper[4749]: I0320 07:38:22.104258 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:38:24 crc kubenswrapper[4749]: I0320 07:38:24.121229 4749 generic.go:334] "Generic (PLEG): container finished" podID="8db06e36-0b00-4157-9345-69449da3e85f" containerID="17c6354fdb5bb7b31802e55acc92a8922ba9fa7a3272f8ce11243a6cbe9be4fa" exitCode=0 Mar 20 07:38:24 crc kubenswrapper[4749]: I0320 07:38:24.121315 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8db06e36-0b00-4157-9345-69449da3e85f","Type":"ContainerDied","Data":"17c6354fdb5bb7b31802e55acc92a8922ba9fa7a3272f8ce11243a6cbe9be4fa"} Mar 20 07:38:24 crc kubenswrapper[4749]: I0320 07:38:24.121664 4749 scope.go:117] "RemoveContainer" containerID="ef7fc647645cb5bd965319e15caea7a582e346e4421913d237600d9c58b5377a" Mar 20 07:38:24 crc kubenswrapper[4749]: I0320 07:38:24.122356 4749 scope.go:117] "RemoveContainer" containerID="17c6354fdb5bb7b31802e55acc92a8922ba9fa7a3272f8ce11243a6cbe9be4fa" Mar 20 07:38:24 crc kubenswrapper[4749]: E0320 07:38:24.122633 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:38:26 crc kubenswrapper[4749]: I0320 07:38:26.148816 4749 generic.go:334] "Generic (PLEG): container finished" podID="8b9b402f-2d95-48f5-98d8-497d90956ba2" containerID="1cd50d58afca40cad64e7875955d25b8686cb618117f93d78af3df8989a4b1c2" exitCode=0 Mar 20 07:38:26 crc kubenswrapper[4749]: I0320 07:38:26.148903 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8b9b402f-2d95-48f5-98d8-497d90956ba2","Type":"ContainerDied","Data":"1cd50d58afca40cad64e7875955d25b8686cb618117f93d78af3df8989a4b1c2"} Mar 20 07:38:26 crc kubenswrapper[4749]: I0320 07:38:26.149230 4749 scope.go:117] "RemoveContainer" containerID="14d60c4857f48c0f7a46df5633942dd48b8898b1846bf13dd59a861469751bf0" Mar 20 07:38:26 crc kubenswrapper[4749]: I0320 07:38:26.150514 4749 scope.go:117] "RemoveContainer" 
containerID="1cd50d58afca40cad64e7875955d25b8686cb618117f93d78af3df8989a4b1c2" Mar 20 07:38:26 crc kubenswrapper[4749]: E0320 07:38:26.151381 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:38:29 crc kubenswrapper[4749]: I0320 07:38:29.004151 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4hfhn"] Mar 20 07:38:29 crc kubenswrapper[4749]: E0320 07:38:29.005908 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8804ec1-0bcd-42e0-bd6c-63af36a81efb" containerName="oc" Mar 20 07:38:29 crc kubenswrapper[4749]: I0320 07:38:29.006005 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8804ec1-0bcd-42e0-bd6c-63af36a81efb" containerName="oc" Mar 20 07:38:29 crc kubenswrapper[4749]: I0320 07:38:29.006526 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8804ec1-0bcd-42e0-bd6c-63af36a81efb" containerName="oc" Mar 20 07:38:29 crc kubenswrapper[4749]: I0320 07:38:29.010090 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4hfhn" Mar 20 07:38:29 crc kubenswrapper[4749]: I0320 07:38:29.020848 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4hfhn"] Mar 20 07:38:29 crc kubenswrapper[4749]: I0320 07:38:29.100704 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be51b648-ce40-402c-a3ce-6eb8b78d13d0-utilities\") pod \"redhat-marketplace-4hfhn\" (UID: \"be51b648-ce40-402c-a3ce-6eb8b78d13d0\") " pod="openshift-marketplace/redhat-marketplace-4hfhn" Mar 20 07:38:29 crc kubenswrapper[4749]: I0320 07:38:29.100875 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hfb5\" (UniqueName: \"kubernetes.io/projected/be51b648-ce40-402c-a3ce-6eb8b78d13d0-kube-api-access-5hfb5\") pod \"redhat-marketplace-4hfhn\" (UID: \"be51b648-ce40-402c-a3ce-6eb8b78d13d0\") " pod="openshift-marketplace/redhat-marketplace-4hfhn" Mar 20 07:38:29 crc kubenswrapper[4749]: I0320 07:38:29.100961 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be51b648-ce40-402c-a3ce-6eb8b78d13d0-catalog-content\") pod \"redhat-marketplace-4hfhn\" (UID: \"be51b648-ce40-402c-a3ce-6eb8b78d13d0\") " pod="openshift-marketplace/redhat-marketplace-4hfhn" Mar 20 07:38:29 crc kubenswrapper[4749]: I0320 07:38:29.202299 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be51b648-ce40-402c-a3ce-6eb8b78d13d0-utilities\") pod \"redhat-marketplace-4hfhn\" (UID: \"be51b648-ce40-402c-a3ce-6eb8b78d13d0\") " pod="openshift-marketplace/redhat-marketplace-4hfhn" Mar 20 07:38:29 crc kubenswrapper[4749]: I0320 07:38:29.202382 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hfb5\" (UniqueName: \"kubernetes.io/projected/be51b648-ce40-402c-a3ce-6eb8b78d13d0-kube-api-access-5hfb5\") pod \"redhat-marketplace-4hfhn\" (UID: 
\"be51b648-ce40-402c-a3ce-6eb8b78d13d0\") " pod="openshift-marketplace/redhat-marketplace-4hfhn" Mar 20 07:38:29 crc kubenswrapper[4749]: I0320 07:38:29.202437 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be51b648-ce40-402c-a3ce-6eb8b78d13d0-catalog-content\") pod \"redhat-marketplace-4hfhn\" (UID: \"be51b648-ce40-402c-a3ce-6eb8b78d13d0\") " pod="openshift-marketplace/redhat-marketplace-4hfhn" Mar 20 07:38:29 crc kubenswrapper[4749]: I0320 07:38:29.202873 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be51b648-ce40-402c-a3ce-6eb8b78d13d0-utilities\") pod \"redhat-marketplace-4hfhn\" (UID: \"be51b648-ce40-402c-a3ce-6eb8b78d13d0\") " pod="openshift-marketplace/redhat-marketplace-4hfhn" Mar 20 07:38:29 crc kubenswrapper[4749]: I0320 07:38:29.202908 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be51b648-ce40-402c-a3ce-6eb8b78d13d0-catalog-content\") pod \"redhat-marketplace-4hfhn\" (UID: \"be51b648-ce40-402c-a3ce-6eb8b78d13d0\") " pod="openshift-marketplace/redhat-marketplace-4hfhn" Mar 20 07:38:29 crc kubenswrapper[4749]: I0320 07:38:29.230827 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hfb5\" (UniqueName: \"kubernetes.io/projected/be51b648-ce40-402c-a3ce-6eb8b78d13d0-kube-api-access-5hfb5\") pod \"redhat-marketplace-4hfhn\" (UID: \"be51b648-ce40-402c-a3ce-6eb8b78d13d0\") " pod="openshift-marketplace/redhat-marketplace-4hfhn" Mar 20 07:38:29 crc kubenswrapper[4749]: I0320 07:38:29.346075 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4hfhn" Mar 20 07:38:29 crc kubenswrapper[4749]: I0320 07:38:29.851008 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4hfhn"] Mar 20 07:38:29 crc kubenswrapper[4749]: W0320 07:38:29.851820 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe51b648_ce40_402c_a3ce_6eb8b78d13d0.slice/crio-319708234eb6ba6845f63a44cdbbc5c7b2fdb7888e81f9fb2ce37e9276fb6778 WatchSource:0}: Error finding container 319708234eb6ba6845f63a44cdbbc5c7b2fdb7888e81f9fb2ce37e9276fb6778: Status 404 returned error can't find the container with id 319708234eb6ba6845f63a44cdbbc5c7b2fdb7888e81f9fb2ce37e9276fb6778 Mar 20 07:38:30 crc kubenswrapper[4749]: I0320 07:38:30.188192 4749 generic.go:334] "Generic (PLEG): container finished" podID="be51b648-ce40-402c-a3ce-6eb8b78d13d0" containerID="3e431b6c21ab2997151dc0d92c8de09db58725153cb7d0342092d03fb99e4ff6" exitCode=0 Mar 20 07:38:30 crc kubenswrapper[4749]: I0320 07:38:30.194456 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4hfhn" event={"ID":"be51b648-ce40-402c-a3ce-6eb8b78d13d0","Type":"ContainerDied","Data":"3e431b6c21ab2997151dc0d92c8de09db58725153cb7d0342092d03fb99e4ff6"} Mar 20 07:38:30 crc kubenswrapper[4749]: I0320 07:38:30.194518 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4hfhn" event={"ID":"be51b648-ce40-402c-a3ce-6eb8b78d13d0","Type":"ContainerStarted","Data":"319708234eb6ba6845f63a44cdbbc5c7b2fdb7888e81f9fb2ce37e9276fb6778"} Mar 20 07:38:31 crc kubenswrapper[4749]: I0320 07:38:31.202269 4749 generic.go:334] "Generic 
(PLEG): container finished" podID="be51b648-ce40-402c-a3ce-6eb8b78d13d0" containerID="8d54dac09b1273d9a4e2d4cf692c4576acdac3a82dd088ff9d0fc54be0b11a35" exitCode=0 Mar 20 07:38:31 crc kubenswrapper[4749]: I0320 07:38:31.202354 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4hfhn" event={"ID":"be51b648-ce40-402c-a3ce-6eb8b78d13d0","Type":"ContainerDied","Data":"8d54dac09b1273d9a4e2d4cf692c4576acdac3a82dd088ff9d0fc54be0b11a35"} Mar 20 07:38:32 crc kubenswrapper[4749]: I0320 07:38:32.214879 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4hfhn" event={"ID":"be51b648-ce40-402c-a3ce-6eb8b78d13d0","Type":"ContainerStarted","Data":"69e1bf12c93f078132c6d1b1e3156992285d01d35e36ec05d5df71a3c578d720"} Mar 20 07:38:32 crc kubenswrapper[4749]: I0320 07:38:32.247313 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4hfhn" podStartSLOduration=2.823542802 podStartE2EDuration="4.247246006s" podCreationTimestamp="2026-03-20 07:38:28 +0000 UTC" firstStartedPulling="2026-03-20 07:38:30.18992371 +0000 UTC m=+1546.739581387" lastFinishedPulling="2026-03-20 07:38:31.613626934 +0000 UTC m=+1548.163284591" observedRunningTime="2026-03-20 07:38:32.237579181 +0000 UTC m=+1548.787236838" watchObservedRunningTime="2026-03-20 07:38:32.247246006 +0000 UTC m=+1548.796903693" Mar 20 07:38:34 crc kubenswrapper[4749]: I0320 07:38:34.520244 4749 patch_prober.go:28] interesting pod/machine-config-daemon-fxqfd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 07:38:34 crc kubenswrapper[4749]: I0320 07:38:34.520745 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:38:34 crc kubenswrapper[4749]: I0320 07:38:34.520878 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" Mar 20 07:38:34 crc kubenswrapper[4749]: I0320 07:38:34.522670 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4d8e78efd9340edd6f9a69c0288b8e640be4f341bfa8796261268cc4055c4563"} pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 07:38:34 crc kubenswrapper[4749]: I0320 07:38:34.522837 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerName="machine-config-daemon" containerID="cri-o://4d8e78efd9340edd6f9a69c0288b8e640be4f341bfa8796261268cc4055c4563" gracePeriod=600 Mar 20 07:38:34 crc kubenswrapper[4749]: E0320 07:38:34.648074 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 07:38:35 crc kubenswrapper[4749]: I0320 07:38:35.241392 4749 generic.go:334] "Generic (PLEG): container finished" podID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerID="4d8e78efd9340edd6f9a69c0288b8e640be4f341bfa8796261268cc4055c4563" exitCode=0 Mar 20 07:38:35 crc kubenswrapper[4749]: I0320 07:38:35.241435 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" event={"ID":"12151228-1cb9-4086-9a62-f4a9583f5f69","Type":"ContainerDied","Data":"4d8e78efd9340edd6f9a69c0288b8e640be4f341bfa8796261268cc4055c4563"} Mar 20 07:38:35 crc kubenswrapper[4749]: I0320 07:38:35.241489 4749 scope.go:117] "RemoveContainer" containerID="91c7008ee23efd7c3f17163220decd2750e56886d85eb31073250a59b5bc0138" Mar 20 07:38:35 crc kubenswrapper[4749]: I0320 07:38:35.242035 4749 scope.go:117] "RemoveContainer" containerID="4d8e78efd9340edd6f9a69c0288b8e640be4f341bfa8796261268cc4055c4563" Mar 20 07:38:35 crc kubenswrapper[4749]: E0320 07:38:35.242246 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 07:38:39 crc kubenswrapper[4749]: I0320 07:38:39.177127 4749 scope.go:117] "RemoveContainer" containerID="17c6354fdb5bb7b31802e55acc92a8922ba9fa7a3272f8ce11243a6cbe9be4fa" Mar 20 07:38:39 crc kubenswrapper[4749]: E0320 07:38:39.178118 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:38:39 crc kubenswrapper[4749]: I0320 07:38:39.346741 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4hfhn" Mar 20 07:38:39 crc kubenswrapper[4749]: I0320 07:38:39.346790 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4hfhn" Mar 20 07:38:39 crc kubenswrapper[4749]: I0320 07:38:39.399265 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4hfhn" Mar 20 07:38:40 crc kubenswrapper[4749]: I0320 07:38:40.177982 4749 scope.go:117] "RemoveContainer" containerID="1cd50d58afca40cad64e7875955d25b8686cb618117f93d78af3df8989a4b1c2" Mar 20 07:38:40 crc kubenswrapper[4749]: E0320 07:38:40.179615 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:38:40 crc kubenswrapper[4749]: I0320 07:38:40.345197 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-4hfhn" Mar 20 07:38:40 crc kubenswrapper[4749]: I0320 07:38:40.407408 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4hfhn"] Mar 20 07:38:42 crc kubenswrapper[4749]: I0320 07:38:42.306451 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4hfhn" podUID="be51b648-ce40-402c-a3ce-6eb8b78d13d0" containerName="registry-server" containerID="cri-o://69e1bf12c93f078132c6d1b1e3156992285d01d35e36ec05d5df71a3c578d720" gracePeriod=2 Mar 20 07:38:42 crc kubenswrapper[4749]: I0320 07:38:42.812062 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4hfhn" Mar 20 07:38:42 crc kubenswrapper[4749]: I0320 07:38:42.942469 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hfb5\" (UniqueName: \"kubernetes.io/projected/be51b648-ce40-402c-a3ce-6eb8b78d13d0-kube-api-access-5hfb5\") pod \"be51b648-ce40-402c-a3ce-6eb8b78d13d0\" (UID: \"be51b648-ce40-402c-a3ce-6eb8b78d13d0\") " Mar 20 07:38:42 crc kubenswrapper[4749]: I0320 07:38:42.942626 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be51b648-ce40-402c-a3ce-6eb8b78d13d0-catalog-content\") pod \"be51b648-ce40-402c-a3ce-6eb8b78d13d0\" (UID: \"be51b648-ce40-402c-a3ce-6eb8b78d13d0\") " Mar 20 07:38:42 crc kubenswrapper[4749]: I0320 07:38:42.943569 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be51b648-ce40-402c-a3ce-6eb8b78d13d0-utilities\") pod \"be51b648-ce40-402c-a3ce-6eb8b78d13d0\" (UID: \"be51b648-ce40-402c-a3ce-6eb8b78d13d0\") " Mar 20 07:38:42 crc kubenswrapper[4749]: I0320 07:38:42.944713 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be51b648-ce40-402c-a3ce-6eb8b78d13d0-utilities" (OuterVolumeSpecName: "utilities") pod "be51b648-ce40-402c-a3ce-6eb8b78d13d0" (UID: "be51b648-ce40-402c-a3ce-6eb8b78d13d0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:38:42 crc kubenswrapper[4749]: I0320 07:38:42.951626 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be51b648-ce40-402c-a3ce-6eb8b78d13d0-kube-api-access-5hfb5" (OuterVolumeSpecName: "kube-api-access-5hfb5") pod "be51b648-ce40-402c-a3ce-6eb8b78d13d0" (UID: "be51b648-ce40-402c-a3ce-6eb8b78d13d0"). InnerVolumeSpecName "kube-api-access-5hfb5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:38:42 crc kubenswrapper[4749]: I0320 07:38:42.995002 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be51b648-ce40-402c-a3ce-6eb8b78d13d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "be51b648-ce40-402c-a3ce-6eb8b78d13d0" (UID: "be51b648-ce40-402c-a3ce-6eb8b78d13d0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:38:43 crc kubenswrapper[4749]: I0320 07:38:43.045967 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be51b648-ce40-402c-a3ce-6eb8b78d13d0-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 07:38:43 crc kubenswrapper[4749]: I0320 07:38:43.046025 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hfb5\" (UniqueName: \"kubernetes.io/projected/be51b648-ce40-402c-a3ce-6eb8b78d13d0-kube-api-access-5hfb5\") on node \"crc\" DevicePath \"\"" Mar 20 07:38:43 crc kubenswrapper[4749]: I0320 07:38:43.046043 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be51b648-ce40-402c-a3ce-6eb8b78d13d0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 07:38:43 crc kubenswrapper[4749]: I0320 07:38:43.324345 4749 generic.go:334] "Generic (PLEG): container finished" podID="be51b648-ce40-402c-a3ce-6eb8b78d13d0" containerID="69e1bf12c93f078132c6d1b1e3156992285d01d35e36ec05d5df71a3c578d720" exitCode=0 Mar 20 07:38:43 crc kubenswrapper[4749]: I0320 07:38:43.324436 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4hfhn" Mar 20 07:38:43 crc kubenswrapper[4749]: I0320 07:38:43.324433 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4hfhn" event={"ID":"be51b648-ce40-402c-a3ce-6eb8b78d13d0","Type":"ContainerDied","Data":"69e1bf12c93f078132c6d1b1e3156992285d01d35e36ec05d5df71a3c578d720"} Mar 20 07:38:43 crc kubenswrapper[4749]: I0320 07:38:43.324644 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4hfhn" event={"ID":"be51b648-ce40-402c-a3ce-6eb8b78d13d0","Type":"ContainerDied","Data":"319708234eb6ba6845f63a44cdbbc5c7b2fdb7888e81f9fb2ce37e9276fb6778"} Mar 20 07:38:43 crc kubenswrapper[4749]: I0320 07:38:43.324683 4749 scope.go:117] "RemoveContainer" containerID="69e1bf12c93f078132c6d1b1e3156992285d01d35e36ec05d5df71a3c578d720" Mar 20 07:38:43 crc kubenswrapper[4749]: I0320 07:38:43.373618 4749 scope.go:117] "RemoveContainer" containerID="8d54dac09b1273d9a4e2d4cf692c4576acdac3a82dd088ff9d0fc54be0b11a35" Mar 20 07:38:43 crc kubenswrapper[4749]: I0320 07:38:43.388965 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4hfhn"] Mar 20 07:38:43 crc kubenswrapper[4749]: I0320 07:38:43.410725 4749 scope.go:117] "RemoveContainer" containerID="3e431b6c21ab2997151dc0d92c8de09db58725153cb7d0342092d03fb99e4ff6" Mar 20 07:38:43 crc kubenswrapper[4749]: I0320 07:38:43.433015 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4hfhn"] Mar 20 07:38:43 crc kubenswrapper[4749]: I0320 07:38:43.445204 4749 scope.go:117] "RemoveContainer" containerID="69e1bf12c93f078132c6d1b1e3156992285d01d35e36ec05d5df71a3c578d720" Mar 20 07:38:43 crc kubenswrapper[4749]: E0320 07:38:43.445686 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69e1bf12c93f078132c6d1b1e3156992285d01d35e36ec05d5df71a3c578d720\": container with ID starting with 69e1bf12c93f078132c6d1b1e3156992285d01d35e36ec05d5df71a3c578d720 not found: ID does not exist" containerID="69e1bf12c93f078132c6d1b1e3156992285d01d35e36ec05d5df71a3c578d720" Mar 20 07:38:43 crc kubenswrapper[4749]: I0320 07:38:43.445733 4749 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69e1bf12c93f078132c6d1b1e3156992285d01d35e36ec05d5df71a3c578d720"} err="failed to get container status \"69e1bf12c93f078132c6d1b1e3156992285d01d35e36ec05d5df71a3c578d720\": rpc error: code = NotFound desc = could not find container \"69e1bf12c93f078132c6d1b1e3156992285d01d35e36ec05d5df71a3c578d720\": container with ID starting with 69e1bf12c93f078132c6d1b1e3156992285d01d35e36ec05d5df71a3c578d720 not found: ID does not exist" Mar 20 07:38:43 crc kubenswrapper[4749]: I0320 07:38:43.445766 4749 scope.go:117] "RemoveContainer" containerID="8d54dac09b1273d9a4e2d4cf692c4576acdac3a82dd088ff9d0fc54be0b11a35" Mar 20 07:38:43 crc kubenswrapper[4749]: E0320 07:38:43.446261 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d54dac09b1273d9a4e2d4cf692c4576acdac3a82dd088ff9d0fc54be0b11a35\": container with ID starting with 8d54dac09b1273d9a4e2d4cf692c4576acdac3a82dd088ff9d0fc54be0b11a35 not found: ID does not exist" containerID="8d54dac09b1273d9a4e2d4cf692c4576acdac3a82dd088ff9d0fc54be0b11a35" Mar 20 07:38:43 crc kubenswrapper[4749]: I0320 07:38:43.446317 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d54dac09b1273d9a4e2d4cf692c4576acdac3a82dd088ff9d0fc54be0b11a35"} err="failed to get container status \"8d54dac09b1273d9a4e2d4cf692c4576acdac3a82dd088ff9d0fc54be0b11a35\": rpc error: code = NotFound desc = could not find container \"8d54dac09b1273d9a4e2d4cf692c4576acdac3a82dd088ff9d0fc54be0b11a35\": container with ID starting with 8d54dac09b1273d9a4e2d4cf692c4576acdac3a82dd088ff9d0fc54be0b11a35 not found: ID does not exist" Mar 20 07:38:43 crc kubenswrapper[4749]: I0320 07:38:43.446343 4749 scope.go:117] "RemoveContainer" containerID="3e431b6c21ab2997151dc0d92c8de09db58725153cb7d0342092d03fb99e4ff6" Mar 20 07:38:43 crc kubenswrapper[4749]: E0320 07:38:43.446703 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e431b6c21ab2997151dc0d92c8de09db58725153cb7d0342092d03fb99e4ff6\": container with ID starting with 3e431b6c21ab2997151dc0d92c8de09db58725153cb7d0342092d03fb99e4ff6 not found: ID does not exist" containerID="3e431b6c21ab2997151dc0d92c8de09db58725153cb7d0342092d03fb99e4ff6" Mar 20 07:38:43 crc kubenswrapper[4749]: I0320 07:38:43.446737 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e431b6c21ab2997151dc0d92c8de09db58725153cb7d0342092d03fb99e4ff6"} err="failed to get container status \"3e431b6c21ab2997151dc0d92c8de09db58725153cb7d0342092d03fb99e4ff6\": rpc error: code = NotFound desc = could not find container \"3e431b6c21ab2997151dc0d92c8de09db58725153cb7d0342092d03fb99e4ff6\": container with ID starting with 3e431b6c21ab2997151dc0d92c8de09db58725153cb7d0342092d03fb99e4ff6 not found: ID does not exist" Mar 20 07:38:44 crc kubenswrapper[4749]: I0320 07:38:44.198501 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be51b648-ce40-402c-a3ce-6eb8b78d13d0" path="/var/lib/kubelet/pods/be51b648-ce40-402c-a3ce-6eb8b78d13d0/volumes" Mar 20 07:38:46 crc kubenswrapper[4749]: I0320 07:38:46.177863 4749 scope.go:117] "RemoveContainer" containerID="4d8e78efd9340edd6f9a69c0288b8e640be4f341bfa8796261268cc4055c4563" Mar 20 07:38:46 crc kubenswrapper[4749]: E0320 07:38:46.178440 4749 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 07:38:52 crc kubenswrapper[4749]: I0320 07:38:52.177631 4749 scope.go:117] "RemoveContainer" containerID="1cd50d58afca40cad64e7875955d25b8686cb618117f93d78af3df8989a4b1c2" Mar 20 07:38:52 crc kubenswrapper[4749]: E0320 07:38:52.178515 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:38:54 crc kubenswrapper[4749]: I0320 07:38:54.183568 4749 scope.go:117] "RemoveContainer" containerID="17c6354fdb5bb7b31802e55acc92a8922ba9fa7a3272f8ce11243a6cbe9be4fa" Mar 20 07:38:54 crc kubenswrapper[4749]: E0320 07:38:54.183832 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:38:54 crc kubenswrapper[4749]: I0320 07:38:54.840206 4749 scope.go:117] "RemoveContainer" containerID="a434af2bf39377c27394825b723ff6b8a39c0413b943fd05a8b143ba92ee7dc8" Mar 20 07:38:54 crc kubenswrapper[4749]: I0320 07:38:54.869497 4749 scope.go:117] "RemoveContainer" containerID="5da4b407aba12ab438460c455de94433793fbff7f93e53cae1e23a30e6882c77" Mar 20 07:38:57 crc kubenswrapper[4749]: I0320 07:38:57.179129 4749 scope.go:117] "RemoveContainer" containerID="4d8e78efd9340edd6f9a69c0288b8e640be4f341bfa8796261268cc4055c4563" Mar 20 07:38:57 crc kubenswrapper[4749]: E0320 07:38:57.179443 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 07:39:04 crc kubenswrapper[4749]: I0320 07:39:04.184391 4749 scope.go:117] "RemoveContainer" containerID="1cd50d58afca40cad64e7875955d25b8686cb618117f93d78af3df8989a4b1c2" Mar 20 07:39:04 crc kubenswrapper[4749]: E0320 07:39:04.185348 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:39:05 crc kubenswrapper[4749]: I0320 07:39:05.177548 4749 scope.go:117] "RemoveContainer" containerID="17c6354fdb5bb7b31802e55acc92a8922ba9fa7a3272f8ce11243a6cbe9be4fa" Mar 20 07:39:05 crc kubenswrapper[4749]: E0320 07:39:05.177836 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:39:08 crc kubenswrapper[4749]: I0320 07:39:08.177089 4749 scope.go:117] "RemoveContainer" containerID="4d8e78efd9340edd6f9a69c0288b8e640be4f341bfa8796261268cc4055c4563" Mar 20 07:39:08 crc kubenswrapper[4749]: E0320 07:39:08.177590 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 07:39:08 crc kubenswrapper[4749]: I0320 07:39:08.893900 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wqj78"] Mar 20 07:39:08 crc kubenswrapper[4749]: E0320 07:39:08.894257 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be51b648-ce40-402c-a3ce-6eb8b78d13d0" containerName="extract-content" Mar 20 07:39:08 crc kubenswrapper[4749]: I0320 07:39:08.894279 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="be51b648-ce40-402c-a3ce-6eb8b78d13d0" containerName="extract-content" Mar 20 07:39:08 crc kubenswrapper[4749]: E0320 07:39:08.894312 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be51b648-ce40-402c-a3ce-6eb8b78d13d0" containerName="registry-server" Mar 20 07:39:08 crc kubenswrapper[4749]: I0320 07:39:08.894318 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="be51b648-ce40-402c-a3ce-6eb8b78d13d0" containerName="registry-server" Mar 20 07:39:08 crc kubenswrapper[4749]: E0320 07:39:08.894345 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be51b648-ce40-402c-a3ce-6eb8b78d13d0" containerName="extract-utilities" Mar 20 07:39:08 crc kubenswrapper[4749]: I0320 07:39:08.894352 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="be51b648-ce40-402c-a3ce-6eb8b78d13d0" containerName="extract-utilities" Mar 20 07:39:08 crc kubenswrapper[4749]: I0320 07:39:08.894581 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="be51b648-ce40-402c-a3ce-6eb8b78d13d0" containerName="registry-server" Mar 20 07:39:08 crc kubenswrapper[4749]: I0320 07:39:08.895928 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wqj78" Mar 20 07:39:08 crc kubenswrapper[4749]: I0320 07:39:08.938853 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wqj78"] Mar 20 07:39:09 crc kubenswrapper[4749]: I0320 07:39:09.017677 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f690860-7538-466c-ac6b-4dc85764cc22-utilities\") pod \"certified-operators-wqj78\" (UID: \"9f690860-7538-466c-ac6b-4dc85764cc22\") " pod="openshift-marketplace/certified-operators-wqj78" Mar 20 07:39:09 crc kubenswrapper[4749]: I0320 07:39:09.017777 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrt45\" (UniqueName: \"kubernetes.io/projected/9f690860-7538-466c-ac6b-4dc85764cc22-kube-api-access-hrt45\") pod \"certified-operators-wqj78\" (UID: \"9f690860-7538-466c-ac6b-4dc85764cc22\") " pod="openshift-marketplace/certified-operators-wqj78" Mar 20 07:39:09 crc kubenswrapper[4749]: I0320 07:39:09.017954 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f690860-7538-466c-ac6b-4dc85764cc22-catalog-content\") pod \"certified-operators-wqj78\" (UID: \"9f690860-7538-466c-ac6b-4dc85764cc22\") " pod="openshift-marketplace/certified-operators-wqj78" Mar 20 07:39:09 crc kubenswrapper[4749]: I0320 07:39:09.119421 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f690860-7538-466c-ac6b-4dc85764cc22-utilities\") pod \"certified-operators-wqj78\" (UID: \"9f690860-7538-466c-ac6b-4dc85764cc22\") " pod="openshift-marketplace/certified-operators-wqj78" Mar 20 07:39:09 crc kubenswrapper[4749]: I0320 07:39:09.119501 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrt45\" (UniqueName: \"kubernetes.io/projected/9f690860-7538-466c-ac6b-4dc85764cc22-kube-api-access-hrt45\") pod \"certified-operators-wqj78\" (UID: \"9f690860-7538-466c-ac6b-4dc85764cc22\") " pod="openshift-marketplace/certified-operators-wqj78" Mar 20 07:39:09 crc kubenswrapper[4749]: I0320 07:39:09.119560 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f690860-7538-466c-ac6b-4dc85764cc22-catalog-content\") pod \"certified-operators-wqj78\" (UID: \"9f690860-7538-466c-ac6b-4dc85764cc22\") " pod="openshift-marketplace/certified-operators-wqj78" Mar 20 07:39:09 crc kubenswrapper[4749]: I0320 07:39:09.119988 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f690860-7538-466c-ac6b-4dc85764cc22-utilities\") pod \"certified-operators-wqj78\" (UID: \"9f690860-7538-466c-ac6b-4dc85764cc22\") " pod="openshift-marketplace/certified-operators-wqj78" Mar 20 07:39:09 crc kubenswrapper[4749]: I0320 07:39:09.120145 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f690860-7538-466c-ac6b-4dc85764cc22-catalog-content\") pod \"certified-operators-wqj78\" (UID: \"9f690860-7538-466c-ac6b-4dc85764cc22\") " pod="openshift-marketplace/certified-operators-wqj78" Mar 20 07:39:09 crc kubenswrapper[4749]: I0320 07:39:09.139262 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-hrt45\" (UniqueName: \"kubernetes.io/projected/9f690860-7538-466c-ac6b-4dc85764cc22-kube-api-access-hrt45\") pod \"certified-operators-wqj78\" (UID: \"9f690860-7538-466c-ac6b-4dc85764cc22\") " pod="openshift-marketplace/certified-operators-wqj78" Mar 20 07:39:09 crc kubenswrapper[4749]: I0320 07:39:09.219146 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wqj78" Mar 20 07:39:09 crc kubenswrapper[4749]: I0320 07:39:09.480696 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wqj78"] Mar 20 07:39:09 crc kubenswrapper[4749]: I0320 07:39:09.585300 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wqj78" event={"ID":"9f690860-7538-466c-ac6b-4dc85764cc22","Type":"ContainerStarted","Data":"3820804bff27a5b9dbb0da5134c80846af2c325f24e52e41c192d89b9fbba248"} Mar 20 07:39:10 crc kubenswrapper[4749]: I0320 07:39:10.597020 4749 generic.go:334] "Generic (PLEG): container finished" podID="9f690860-7538-466c-ac6b-4dc85764cc22" containerID="95b800183a85b33cfbd3ee501da4c4e0f6d372ae794d8ec7efbb79a8875e1750" exitCode=0 Mar 20 07:39:10 crc kubenswrapper[4749]: I0320 07:39:10.597085 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wqj78" event={"ID":"9f690860-7538-466c-ac6b-4dc85764cc22","Type":"ContainerDied","Data":"95b800183a85b33cfbd3ee501da4c4e0f6d372ae794d8ec7efbb79a8875e1750"} Mar 20 07:39:11 crc kubenswrapper[4749]: I0320 07:39:11.608304 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wqj78" event={"ID":"9f690860-7538-466c-ac6b-4dc85764cc22","Type":"ContainerStarted","Data":"bfc2ff6ba3bb74552dba1b6e8dd398e70129c20bdeb06c1b1bc37cd81abde4d0"} Mar 20 07:39:12 crc kubenswrapper[4749]: I0320 07:39:12.619114 4749 generic.go:334] "Generic (PLEG): container finished" podID="9f690860-7538-466c-ac6b-4dc85764cc22" containerID="bfc2ff6ba3bb74552dba1b6e8dd398e70129c20bdeb06c1b1bc37cd81abde4d0" exitCode=0 Mar 20 07:39:12 crc kubenswrapper[4749]: I0320 07:39:12.619371 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wqj78" event={"ID":"9f690860-7538-466c-ac6b-4dc85764cc22","Type":"ContainerDied","Data":"bfc2ff6ba3bb74552dba1b6e8dd398e70129c20bdeb06c1b1bc37cd81abde4d0"} Mar 20 07:39:13 crc kubenswrapper[4749]: I0320 07:39:13.633087 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wqj78" event={"ID":"9f690860-7538-466c-ac6b-4dc85764cc22","Type":"ContainerStarted","Data":"71f5faaef1064e127b2a2cee4f3a453364867584a267aadedb4175c1ff65df92"} Mar 20 07:39:13 crc kubenswrapper[4749]: I0320 07:39:13.656765 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wqj78" podStartSLOduration=3.237518284 podStartE2EDuration="5.656732427s" podCreationTimestamp="2026-03-20 07:39:08 +0000 UTC" firstStartedPulling="2026-03-20 07:39:10.601664609 +0000 UTC m=+1587.151322256" lastFinishedPulling="2026-03-20 07:39:13.020878752 +0000 UTC m=+1589.570536399" observedRunningTime="2026-03-20 07:39:13.654885323 +0000 UTC m=+1590.204543020" watchObservedRunningTime="2026-03-20 07:39:13.656732427 +0000 UTC m=+1590.206390114" Mar 20 07:39:17 crc kubenswrapper[4749]: I0320 07:39:17.178205 4749 scope.go:117] "RemoveContainer" 
containerID="1cd50d58afca40cad64e7875955d25b8686cb618117f93d78af3df8989a4b1c2" Mar 20 07:39:17 crc kubenswrapper[4749]: E0320 07:39:17.179188 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:39:18 crc kubenswrapper[4749]: I0320 07:39:18.177662 4749 scope.go:117] "RemoveContainer" containerID="17c6354fdb5bb7b31802e55acc92a8922ba9fa7a3272f8ce11243a6cbe9be4fa" Mar 20 07:39:18 crc kubenswrapper[4749]: E0320 07:39:18.178705 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:39:19 crc kubenswrapper[4749]: I0320 07:39:19.177448 4749 scope.go:117] "RemoveContainer" containerID="4d8e78efd9340edd6f9a69c0288b8e640be4f341bfa8796261268cc4055c4563" Mar 20 07:39:19 crc kubenswrapper[4749]: E0320 07:39:19.177753 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 07:39:19 crc kubenswrapper[4749]: I0320 07:39:19.220003 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wqj78" Mar 20 07:39:19 crc kubenswrapper[4749]: I0320 07:39:19.220057 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wqj78" Mar 20 07:39:19 crc kubenswrapper[4749]: I0320 07:39:19.271922 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wqj78" Mar 20 07:39:19 crc kubenswrapper[4749]: I0320 07:39:19.721993 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wqj78" Mar 20 07:39:19 crc kubenswrapper[4749]: I0320 07:39:19.774103 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wqj78"] Mar 20 07:39:21 crc kubenswrapper[4749]: I0320 07:39:21.706196 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wqj78" podUID="9f690860-7538-466c-ac6b-4dc85764cc22" containerName="registry-server" containerID="cri-o://71f5faaef1064e127b2a2cee4f3a453364867584a267aadedb4175c1ff65df92" gracePeriod=2 Mar 20 07:39:22 crc kubenswrapper[4749]: I0320 07:39:22.259808 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wqj78" Mar 20 07:39:22 crc kubenswrapper[4749]: I0320 07:39:22.342250 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrt45\" (UniqueName: \"kubernetes.io/projected/9f690860-7538-466c-ac6b-4dc85764cc22-kube-api-access-hrt45\") pod \"9f690860-7538-466c-ac6b-4dc85764cc22\" (UID: \"9f690860-7538-466c-ac6b-4dc85764cc22\") " Mar 20 07:39:22 crc kubenswrapper[4749]: I0320 07:39:22.342304 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f690860-7538-466c-ac6b-4dc85764cc22-utilities\") pod \"9f690860-7538-466c-ac6b-4dc85764cc22\" (UID: \"9f690860-7538-466c-ac6b-4dc85764cc22\") " Mar 20 07:39:22 crc kubenswrapper[4749]: I0320 07:39:22.342350 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f690860-7538-466c-ac6b-4dc85764cc22-catalog-content\") pod \"9f690860-7538-466c-ac6b-4dc85764cc22\" (UID: \"9f690860-7538-466c-ac6b-4dc85764cc22\") " Mar 20 07:39:22 crc kubenswrapper[4749]: I0320 07:39:22.343192 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f690860-7538-466c-ac6b-4dc85764cc22-utilities" (OuterVolumeSpecName: "utilities") pod "9f690860-7538-466c-ac6b-4dc85764cc22" (UID: "9f690860-7538-466c-ac6b-4dc85764cc22"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:39:22 crc kubenswrapper[4749]: I0320 07:39:22.348324 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f690860-7538-466c-ac6b-4dc85764cc22-kube-api-access-hrt45" (OuterVolumeSpecName: "kube-api-access-hrt45") pod "9f690860-7538-466c-ac6b-4dc85764cc22" (UID: "9f690860-7538-466c-ac6b-4dc85764cc22"). InnerVolumeSpecName "kube-api-access-hrt45". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:39:22 crc kubenswrapper[4749]: I0320 07:39:22.402552 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f690860-7538-466c-ac6b-4dc85764cc22-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9f690860-7538-466c-ac6b-4dc85764cc22" (UID: "9f690860-7538-466c-ac6b-4dc85764cc22"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:39:22 crc kubenswrapper[4749]: I0320 07:39:22.444180 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrt45\" (UniqueName: \"kubernetes.io/projected/9f690860-7538-466c-ac6b-4dc85764cc22-kube-api-access-hrt45\") on node \"crc\" DevicePath \"\"" Mar 20 07:39:22 crc kubenswrapper[4749]: I0320 07:39:22.444222 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f690860-7538-466c-ac6b-4dc85764cc22-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 07:39:22 crc kubenswrapper[4749]: I0320 07:39:22.444232 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f690860-7538-466c-ac6b-4dc85764cc22-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 07:39:22 crc kubenswrapper[4749]: I0320 07:39:22.716881 4749 generic.go:334] "Generic (PLEG): container finished" podID="9f690860-7538-466c-ac6b-4dc85764cc22" containerID="71f5faaef1064e127b2a2cee4f3a453364867584a267aadedb4175c1ff65df92" exitCode=0 Mar 20 07:39:22 crc kubenswrapper[4749]: I0320 07:39:22.716954 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wqj78" Mar 20 07:39:22 crc kubenswrapper[4749]: I0320 07:39:22.716952 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wqj78" event={"ID":"9f690860-7538-466c-ac6b-4dc85764cc22","Type":"ContainerDied","Data":"71f5faaef1064e127b2a2cee4f3a453364867584a267aadedb4175c1ff65df92"} Mar 20 07:39:22 crc kubenswrapper[4749]: I0320 07:39:22.717352 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wqj78" event={"ID":"9f690860-7538-466c-ac6b-4dc85764cc22","Type":"ContainerDied","Data":"3820804bff27a5b9dbb0da5134c80846af2c325f24e52e41c192d89b9fbba248"} Mar 20 07:39:22 crc kubenswrapper[4749]: I0320 07:39:22.717380 4749 scope.go:117] "RemoveContainer" containerID="71f5faaef1064e127b2a2cee4f3a453364867584a267aadedb4175c1ff65df92" Mar 20 07:39:22 crc kubenswrapper[4749]: I0320 07:39:22.739233 4749 scope.go:117] "RemoveContainer" containerID="bfc2ff6ba3bb74552dba1b6e8dd398e70129c20bdeb06c1b1bc37cd81abde4d0" Mar 20 07:39:22 crc kubenswrapper[4749]: I0320 07:39:22.749925 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wqj78"] Mar 20 07:39:22 crc kubenswrapper[4749]: I0320 07:39:22.755792 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wqj78"] Mar 20 07:39:22 crc kubenswrapper[4749]: I0320 07:39:22.784168 4749 scope.go:117] "RemoveContainer" containerID="95b800183a85b33cfbd3ee501da4c4e0f6d372ae794d8ec7efbb79a8875e1750" Mar 20 07:39:22 crc kubenswrapper[4749]: I0320 07:39:22.807412 4749 scope.go:117] "RemoveContainer" containerID="71f5faaef1064e127b2a2cee4f3a453364867584a267aadedb4175c1ff65df92" Mar 20 07:39:22 crc kubenswrapper[4749]: E0320 07:39:22.807861 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71f5faaef1064e127b2a2cee4f3a453364867584a267aadedb4175c1ff65df92\": container with ID starting with 71f5faaef1064e127b2a2cee4f3a453364867584a267aadedb4175c1ff65df92 not found: ID does not exist" containerID="71f5faaef1064e127b2a2cee4f3a453364867584a267aadedb4175c1ff65df92" Mar 20 07:39:22 crc kubenswrapper[4749]: I0320 07:39:22.807898 
4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71f5faaef1064e127b2a2cee4f3a453364867584a267aadedb4175c1ff65df92"} err="failed to get container status \"71f5faaef1064e127b2a2cee4f3a453364867584a267aadedb4175c1ff65df92\": rpc error: code = NotFound desc = could not find container \"71f5faaef1064e127b2a2cee4f3a453364867584a267aadedb4175c1ff65df92\": container with ID starting with 71f5faaef1064e127b2a2cee4f3a453364867584a267aadedb4175c1ff65df92 not found: ID does not exist" Mar 20 07:39:22 crc kubenswrapper[4749]: I0320 07:39:22.807933 4749 scope.go:117] "RemoveContainer" containerID="bfc2ff6ba3bb74552dba1b6e8dd398e70129c20bdeb06c1b1bc37cd81abde4d0" Mar 20 07:39:22 crc kubenswrapper[4749]: E0320 07:39:22.808739 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfc2ff6ba3bb74552dba1b6e8dd398e70129c20bdeb06c1b1bc37cd81abde4d0\": container with ID starting with bfc2ff6ba3bb74552dba1b6e8dd398e70129c20bdeb06c1b1bc37cd81abde4d0 not found: ID does not exist" containerID="bfc2ff6ba3bb74552dba1b6e8dd398e70129c20bdeb06c1b1bc37cd81abde4d0" Mar 20 07:39:22 crc kubenswrapper[4749]: I0320 07:39:22.808765 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfc2ff6ba3bb74552dba1b6e8dd398e70129c20bdeb06c1b1bc37cd81abde4d0"} err="failed to get container status \"bfc2ff6ba3bb74552dba1b6e8dd398e70129c20bdeb06c1b1bc37cd81abde4d0\": rpc error: code = NotFound desc = could not find container \"bfc2ff6ba3bb74552dba1b6e8dd398e70129c20bdeb06c1b1bc37cd81abde4d0\": container with ID starting with bfc2ff6ba3bb74552dba1b6e8dd398e70129c20bdeb06c1b1bc37cd81abde4d0 not found: ID does not exist" Mar 20 07:39:22 crc kubenswrapper[4749]: I0320 07:39:22.808779 4749 scope.go:117] "RemoveContainer" containerID="95b800183a85b33cfbd3ee501da4c4e0f6d372ae794d8ec7efbb79a8875e1750" Mar 20 07:39:22 crc kubenswrapper[4749]: E0320 07:39:22.809246 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95b800183a85b33cfbd3ee501da4c4e0f6d372ae794d8ec7efbb79a8875e1750\": container with ID starting with 95b800183a85b33cfbd3ee501da4c4e0f6d372ae794d8ec7efbb79a8875e1750 not found: ID does not exist" containerID="95b800183a85b33cfbd3ee501da4c4e0f6d372ae794d8ec7efbb79a8875e1750" Mar 20 07:39:22 crc kubenswrapper[4749]: I0320 07:39:22.809266 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95b800183a85b33cfbd3ee501da4c4e0f6d372ae794d8ec7efbb79a8875e1750"} err="failed to get container status \"95b800183a85b33cfbd3ee501da4c4e0f6d372ae794d8ec7efbb79a8875e1750\": rpc error: code = NotFound desc = could not find container \"95b800183a85b33cfbd3ee501da4c4e0f6d372ae794d8ec7efbb79a8875e1750\": container with ID starting with 95b800183a85b33cfbd3ee501da4c4e0f6d372ae794d8ec7efbb79a8875e1750 not found: ID does not exist" Mar 20 07:39:24 crc kubenswrapper[4749]: I0320 07:39:24.192680 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f690860-7538-466c-ac6b-4dc85764cc22" path="/var/lib/kubelet/pods/9f690860-7538-466c-ac6b-4dc85764cc22/volumes" Mar 20 07:39:31 crc kubenswrapper[4749]: I0320 07:39:31.177762 4749 scope.go:117] "RemoveContainer" containerID="1cd50d58afca40cad64e7875955d25b8686cb618117f93d78af3df8989a4b1c2" Mar 20 07:39:31 crc kubenswrapper[4749]: I0320 07:39:31.178165 4749 scope.go:117] "RemoveContainer" 
containerID="17c6354fdb5bb7b31802e55acc92a8922ba9fa7a3272f8ce11243a6cbe9be4fa" Mar 20 07:39:31 crc kubenswrapper[4749]: E0320 07:39:31.178342 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:39:31 crc kubenswrapper[4749]: E0320 07:39:31.178346 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:39:32 crc kubenswrapper[4749]: I0320 07:39:32.178580 4749 scope.go:117] "RemoveContainer" containerID="4d8e78efd9340edd6f9a69c0288b8e640be4f341bfa8796261268cc4055c4563" Mar 20 07:39:32 crc kubenswrapper[4749]: E0320 07:39:32.179148 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 07:39:45 crc kubenswrapper[4749]: I0320 07:39:45.177644 4749 scope.go:117] "RemoveContainer" containerID="1cd50d58afca40cad64e7875955d25b8686cb618117f93d78af3df8989a4b1c2" Mar 20 07:39:45 crc kubenswrapper[4749]: I0320 07:39:45.178082 4749 scope.go:117] "RemoveContainer" containerID="4d8e78efd9340edd6f9a69c0288b8e640be4f341bfa8796261268cc4055c4563" Mar 20 07:39:45 crc kubenswrapper[4749]: E0320 07:39:45.178302 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:39:45 crc kubenswrapper[4749]: E0320 07:39:45.178411 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 07:39:46 crc kubenswrapper[4749]: I0320 07:39:46.178269 4749 scope.go:117] "RemoveContainer" containerID="17c6354fdb5bb7b31802e55acc92a8922ba9fa7a3272f8ce11243a6cbe9be4fa" Mar 20 07:39:46 crc kubenswrapper[4749]: E0320 07:39:46.179001 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:39:57 crc kubenswrapper[4749]: I0320 07:39:57.177489 4749 scope.go:117] "RemoveContainer" 
containerID="1cd50d58afca40cad64e7875955d25b8686cb618117f93d78af3df8989a4b1c2" Mar 20 07:39:57 crc kubenswrapper[4749]: E0320 07:39:57.178237 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:40:00 crc kubenswrapper[4749]: I0320 07:40:00.153696 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566540-nm2rc"] Mar 20 07:40:00 crc kubenswrapper[4749]: E0320 07:40:00.154373 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f690860-7538-466c-ac6b-4dc85764cc22" containerName="registry-server" Mar 20 07:40:00 crc kubenswrapper[4749]: I0320 07:40:00.154389 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f690860-7538-466c-ac6b-4dc85764cc22" containerName="registry-server" Mar 20 07:40:00 crc kubenswrapper[4749]: E0320 07:40:00.154425 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f690860-7538-466c-ac6b-4dc85764cc22" containerName="extract-utilities" Mar 20 07:40:00 crc kubenswrapper[4749]: I0320 07:40:00.154433 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f690860-7538-466c-ac6b-4dc85764cc22" containerName="extract-utilities" Mar 20 07:40:00 crc kubenswrapper[4749]: E0320 07:40:00.154452 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f690860-7538-466c-ac6b-4dc85764cc22" containerName="extract-content" Mar 20 07:40:00 crc kubenswrapper[4749]: I0320 07:40:00.154462 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f690860-7538-466c-ac6b-4dc85764cc22" containerName="extract-content" Mar 20 07:40:00 crc kubenswrapper[4749]: I0320 07:40:00.154672 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f690860-7538-466c-ac6b-4dc85764cc22" containerName="registry-server" Mar 20 07:40:00 crc kubenswrapper[4749]: I0320 07:40:00.155240 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566540-nm2rc" Mar 20 07:40:00 crc kubenswrapper[4749]: I0320 07:40:00.158566 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 07:40:00 crc kubenswrapper[4749]: I0320 07:40:00.158853 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 07:40:00 crc kubenswrapper[4749]: I0320 07:40:00.166172 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6rhdf" Mar 20 07:40:00 crc kubenswrapper[4749]: I0320 07:40:00.173515 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566540-nm2rc"] Mar 20 07:40:00 crc kubenswrapper[4749]: I0320 07:40:00.177961 4749 scope.go:117] "RemoveContainer" containerID="4d8e78efd9340edd6f9a69c0288b8e640be4f341bfa8796261268cc4055c4563" Mar 20 07:40:00 crc kubenswrapper[4749]: I0320 07:40:00.178213 4749 scope.go:117] "RemoveContainer" containerID="17c6354fdb5bb7b31802e55acc92a8922ba9fa7a3272f8ce11243a6cbe9be4fa" Mar 20 07:40:00 crc kubenswrapper[4749]: E0320 07:40:00.178225 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 07:40:00 crc kubenswrapper[4749]: E0320 07:40:00.178520 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:40:00 crc kubenswrapper[4749]: I0320 07:40:00.278602 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtnhw\" (UniqueName: \"kubernetes.io/projected/dfac3eb1-0447-4727-bba6-62d133e9f4c1-kube-api-access-mtnhw\") pod \"auto-csr-approver-29566540-nm2rc\" (UID: \"dfac3eb1-0447-4727-bba6-62d133e9f4c1\") " pod="openshift-infra/auto-csr-approver-29566540-nm2rc" Mar 20 07:40:00 crc kubenswrapper[4749]: I0320 07:40:00.381258 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtnhw\" (UniqueName: \"kubernetes.io/projected/dfac3eb1-0447-4727-bba6-62d133e9f4c1-kube-api-access-mtnhw\") pod \"auto-csr-approver-29566540-nm2rc\" (UID: \"dfac3eb1-0447-4727-bba6-62d133e9f4c1\") " pod="openshift-infra/auto-csr-approver-29566540-nm2rc" Mar 20 07:40:00 crc kubenswrapper[4749]: I0320 07:40:00.414117 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtnhw\" (UniqueName: \"kubernetes.io/projected/dfac3eb1-0447-4727-bba6-62d133e9f4c1-kube-api-access-mtnhw\") pod \"auto-csr-approver-29566540-nm2rc\" (UID: \"dfac3eb1-0447-4727-bba6-62d133e9f4c1\") " pod="openshift-infra/auto-csr-approver-29566540-nm2rc" Mar 20 07:40:00 crc kubenswrapper[4749]: I0320 07:40:00.495031 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566540-nm2rc" Mar 20 07:40:00 crc kubenswrapper[4749]: I0320 07:40:00.992393 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566540-nm2rc"] Mar 20 07:40:01 crc kubenswrapper[4749]: I0320 07:40:01.045339 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566540-nm2rc" event={"ID":"dfac3eb1-0447-4727-bba6-62d133e9f4c1","Type":"ContainerStarted","Data":"147f0cc7109be8382d029c8bcc50ec9d0a881fb1d49b7c56c62e549cb8bf6c43"} Mar 20 07:40:03 crc kubenswrapper[4749]: I0320 07:40:03.065185 4749 generic.go:334] "Generic (PLEG): container finished" podID="dfac3eb1-0447-4727-bba6-62d133e9f4c1" containerID="d49308ccaa0bd5a1ddfc4b3b993308b14c3ecbb9810f9a41522cd26ed17e80b9" exitCode=0 Mar 20 07:40:03 crc kubenswrapper[4749]: I0320 07:40:03.065235 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566540-nm2rc" event={"ID":"dfac3eb1-0447-4727-bba6-62d133e9f4c1","Type":"ContainerDied","Data":"d49308ccaa0bd5a1ddfc4b3b993308b14c3ecbb9810f9a41522cd26ed17e80b9"} Mar 20 07:40:04 crc kubenswrapper[4749]: I0320 07:40:04.426011 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566540-nm2rc" Mar 20 07:40:04 crc kubenswrapper[4749]: I0320 07:40:04.560058 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtnhw\" (UniqueName: \"kubernetes.io/projected/dfac3eb1-0447-4727-bba6-62d133e9f4c1-kube-api-access-mtnhw\") pod \"dfac3eb1-0447-4727-bba6-62d133e9f4c1\" (UID: \"dfac3eb1-0447-4727-bba6-62d133e9f4c1\") " Mar 20 07:40:04 crc kubenswrapper[4749]: I0320 07:40:04.571460 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfac3eb1-0447-4727-bba6-62d133e9f4c1-kube-api-access-mtnhw" (OuterVolumeSpecName: "kube-api-access-mtnhw") pod "dfac3eb1-0447-4727-bba6-62d133e9f4c1" (UID: "dfac3eb1-0447-4727-bba6-62d133e9f4c1"). InnerVolumeSpecName "kube-api-access-mtnhw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:40:04 crc kubenswrapper[4749]: I0320 07:40:04.663365 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtnhw\" (UniqueName: \"kubernetes.io/projected/dfac3eb1-0447-4727-bba6-62d133e9f4c1-kube-api-access-mtnhw\") on node \"crc\" DevicePath \"\"" Mar 20 07:40:05 crc kubenswrapper[4749]: I0320 07:40:05.094786 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566540-nm2rc" event={"ID":"dfac3eb1-0447-4727-bba6-62d133e9f4c1","Type":"ContainerDied","Data":"147f0cc7109be8382d029c8bcc50ec9d0a881fb1d49b7c56c62e549cb8bf6c43"} Mar 20 07:40:05 crc kubenswrapper[4749]: I0320 07:40:05.094857 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="147f0cc7109be8382d029c8bcc50ec9d0a881fb1d49b7c56c62e549cb8bf6c43" Mar 20 07:40:05 crc kubenswrapper[4749]: I0320 07:40:05.094896 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566540-nm2rc" Mar 20 07:40:05 crc kubenswrapper[4749]: I0320 07:40:05.516883 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566534-68w5t"] Mar 20 07:40:05 crc kubenswrapper[4749]: I0320 07:40:05.522972 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566534-68w5t"] Mar 20 07:40:06 crc kubenswrapper[4749]: I0320 07:40:06.190042 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f631d0f-f106-44f8-93f6-6e16924f5931" path="/var/lib/kubelet/pods/3f631d0f-f106-44f8-93f6-6e16924f5931/volumes" Mar 20 07:40:08 crc kubenswrapper[4749]: I0320 07:40:08.178120 4749 scope.go:117] "RemoveContainer" containerID="1cd50d58afca40cad64e7875955d25b8686cb618117f93d78af3df8989a4b1c2" Mar 20 07:40:08 crc kubenswrapper[4749]: E0320 07:40:08.178561 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:40:12 crc kubenswrapper[4749]: I0320 07:40:12.178153 4749 scope.go:117] "RemoveContainer" containerID="4d8e78efd9340edd6f9a69c0288b8e640be4f341bfa8796261268cc4055c4563" Mar 20 07:40:12 crc kubenswrapper[4749]: E0320 07:40:12.179322 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 07:40:13 crc kubenswrapper[4749]: I0320 07:40:13.177229 4749 scope.go:117] "RemoveContainer" containerID="17c6354fdb5bb7b31802e55acc92a8922ba9fa7a3272f8ce11243a6cbe9be4fa" Mar 20 07:40:13 crc kubenswrapper[4749]: E0320 07:40:13.177634 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:40:19 crc kubenswrapper[4749]: I0320 07:40:19.177060 4749 scope.go:117] "RemoveContainer" containerID="1cd50d58afca40cad64e7875955d25b8686cb618117f93d78af3df8989a4b1c2" Mar 20 07:40:19 crc kubenswrapper[4749]: E0320 07:40:19.177929 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:40:26 crc kubenswrapper[4749]: I0320 07:40:26.177755 4749 scope.go:117] "RemoveContainer" containerID="4d8e78efd9340edd6f9a69c0288b8e640be4f341bfa8796261268cc4055c4563" Mar 20 07:40:26 crc kubenswrapper[4749]: E0320 07:40:26.178414 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 07:40:27 crc kubenswrapper[4749]: I0320 07:40:27.177870 4749 scope.go:117] "RemoveContainer" containerID="17c6354fdb5bb7b31802e55acc92a8922ba9fa7a3272f8ce11243a6cbe9be4fa" Mar 20 07:40:27 crc kubenswrapper[4749]: E0320 07:40:27.178385 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:40:31 crc kubenswrapper[4749]: I0320 07:40:31.177547 4749 scope.go:117] "RemoveContainer" containerID="1cd50d58afca40cad64e7875955d25b8686cb618117f93d78af3df8989a4b1c2" Mar 20 07:40:31 crc kubenswrapper[4749]: E0320 07:40:31.178334 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:40:41 crc kubenswrapper[4749]: I0320 07:40:41.177637 4749 scope.go:117] "RemoveContainer" containerID="17c6354fdb5bb7b31802e55acc92a8922ba9fa7a3272f8ce11243a6cbe9be4fa" Mar 20 07:40:41 crc kubenswrapper[4749]: I0320 07:40:41.178426 4749 scope.go:117] "RemoveContainer" containerID="4d8e78efd9340edd6f9a69c0288b8e640be4f341bfa8796261268cc4055c4563" Mar 20 07:40:41 crc kubenswrapper[4749]: E0320 07:40:41.178841 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 07:40:41 crc kubenswrapper[4749]: E0320 07:40:41.178848 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:40:42 crc kubenswrapper[4749]: I0320 07:40:42.178182 4749 scope.go:117] "RemoveContainer" containerID="1cd50d58afca40cad64e7875955d25b8686cb618117f93d78af3df8989a4b1c2" Mar 20 07:40:42 crc kubenswrapper[4749]: E0320 07:40:42.178756 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:40:53 crc kubenswrapper[4749]: I0320 07:40:53.177753 4749 scope.go:117] "RemoveContainer" containerID="4d8e78efd9340edd6f9a69c0288b8e640be4f341bfa8796261268cc4055c4563" Mar 20 07:40:53 crc kubenswrapper[4749]: E0320 07:40:53.178497 4749 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 07:40:54 crc kubenswrapper[4749]: I0320 07:40:54.187854 4749 scope.go:117] "RemoveContainer" containerID="1cd50d58afca40cad64e7875955d25b8686cb618117f93d78af3df8989a4b1c2" Mar 20 07:40:54 crc kubenswrapper[4749]: E0320 07:40:54.188214 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:40:54 crc kubenswrapper[4749]: I0320 07:40:54.188854 4749 scope.go:117] "RemoveContainer" containerID="17c6354fdb5bb7b31802e55acc92a8922ba9fa7a3272f8ce11243a6cbe9be4fa" Mar 20 07:40:54 crc kubenswrapper[4749]: E0320 07:40:54.189634 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:40:55 crc kubenswrapper[4749]: I0320 07:40:55.024356 4749 scope.go:117] "RemoveContainer" containerID="e7aba9a1598ed3930860f678ea88023de80376569b971db9a3061df46e140a21" Mar 20 07:41:06 crc kubenswrapper[4749]: I0320 07:41:06.177663 4749 scope.go:117] "RemoveContainer" containerID="4d8e78efd9340edd6f9a69c0288b8e640be4f341bfa8796261268cc4055c4563" Mar 20 07:41:06 crc kubenswrapper[4749]: E0320 07:41:06.178542 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 07:41:07 crc kubenswrapper[4749]: I0320 07:41:07.177957 4749 scope.go:117] "RemoveContainer" containerID="1cd50d58afca40cad64e7875955d25b8686cb618117f93d78af3df8989a4b1c2" Mar 20 07:41:07 crc kubenswrapper[4749]: I0320 07:41:07.178109 4749 scope.go:117] "RemoveContainer" containerID="17c6354fdb5bb7b31802e55acc92a8922ba9fa7a3272f8ce11243a6cbe9be4fa" Mar 20 07:41:07 crc kubenswrapper[4749]: E0320 07:41:07.178240 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:41:07 crc kubenswrapper[4749]: E0320 07:41:07.178470 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" 
podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:41:17 crc kubenswrapper[4749]: I0320 07:41:17.178532 4749 scope.go:117] "RemoveContainer" containerID="4d8e78efd9340edd6f9a69c0288b8e640be4f341bfa8796261268cc4055c4563" Mar 20 07:41:17 crc kubenswrapper[4749]: E0320 07:41:17.179701 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 07:41:18 crc kubenswrapper[4749]: I0320 07:41:18.177875 4749 scope.go:117] "RemoveContainer" containerID="17c6354fdb5bb7b31802e55acc92a8922ba9fa7a3272f8ce11243a6cbe9be4fa" Mar 20 07:41:18 crc kubenswrapper[4749]: E0320 07:41:18.178636 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:41:20 crc kubenswrapper[4749]: I0320 07:41:20.177223 4749 scope.go:117] "RemoveContainer" containerID="1cd50d58afca40cad64e7875955d25b8686cb618117f93d78af3df8989a4b1c2" Mar 20 07:41:20 crc kubenswrapper[4749]: E0320 07:41:20.177501 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:41:29 crc kubenswrapper[4749]: I0320 07:41:29.178421 4749 scope.go:117] "RemoveContainer" containerID="17c6354fdb5bb7b31802e55acc92a8922ba9fa7a3272f8ce11243a6cbe9be4fa" Mar 20 07:41:29 crc kubenswrapper[4749]: E0320 07:41:29.179165 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:41:32 crc kubenswrapper[4749]: I0320 07:41:32.177619 4749 scope.go:117] "RemoveContainer" containerID="4d8e78efd9340edd6f9a69c0288b8e640be4f341bfa8796261268cc4055c4563" Mar 20 07:41:32 crc kubenswrapper[4749]: E0320 07:41:32.178187 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 07:41:34 crc kubenswrapper[4749]: I0320 07:41:34.191864 4749 scope.go:117] "RemoveContainer" containerID="1cd50d58afca40cad64e7875955d25b8686cb618117f93d78af3df8989a4b1c2" Mar 20 07:41:34 crc kubenswrapper[4749]: E0320 07:41:34.193216 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:41:41 crc kubenswrapper[4749]: I0320 07:41:41.177768 4749 scope.go:117] "RemoveContainer" containerID="17c6354fdb5bb7b31802e55acc92a8922ba9fa7a3272f8ce11243a6cbe9be4fa" Mar 20 07:41:41 crc kubenswrapper[4749]: E0320 07:41:41.178519 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:41:47 crc kubenswrapper[4749]: I0320 07:41:47.177105 4749 scope.go:117] "RemoveContainer" containerID="4d8e78efd9340edd6f9a69c0288b8e640be4f341bfa8796261268cc4055c4563" Mar 20 07:41:47 crc kubenswrapper[4749]: E0320 07:41:47.177885 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 07:41:48 crc kubenswrapper[4749]: I0320 07:41:48.177161 4749 scope.go:117] "RemoveContainer" containerID="1cd50d58afca40cad64e7875955d25b8686cb618117f93d78af3df8989a4b1c2" Mar 20 07:41:48 crc kubenswrapper[4749]: E0320 07:41:48.177441 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:41:54 crc kubenswrapper[4749]: I0320 07:41:54.183871 4749 scope.go:117] "RemoveContainer" containerID="17c6354fdb5bb7b31802e55acc92a8922ba9fa7a3272f8ce11243a6cbe9be4fa" Mar 20 07:41:54 crc kubenswrapper[4749]: E0320 07:41:54.184801 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:42:00 crc kubenswrapper[4749]: I0320 07:42:00.153490 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566542-lps6m"] Mar 20 07:42:00 crc kubenswrapper[4749]: E0320 07:42:00.154882 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfac3eb1-0447-4727-bba6-62d133e9f4c1" containerName="oc" Mar 20 07:42:00 crc kubenswrapper[4749]: I0320 07:42:00.154915 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfac3eb1-0447-4727-bba6-62d133e9f4c1" containerName="oc" Mar 20 07:42:00 crc kubenswrapper[4749]: I0320 07:42:00.155376 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfac3eb1-0447-4727-bba6-62d133e9f4c1" containerName="oc" Mar 20 07:42:00 crc kubenswrapper[4749]: I0320 07:42:00.156461 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566542-lps6m" Mar 20 07:42:00 crc kubenswrapper[4749]: I0320 07:42:00.159641 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 07:42:00 crc kubenswrapper[4749]: I0320 07:42:00.160409 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 07:42:00 crc kubenswrapper[4749]: I0320 07:42:00.160581 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6rhdf" Mar 20 07:42:00 crc kubenswrapper[4749]: I0320 07:42:00.163650 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566542-lps6m"] Mar 20 07:42:00 crc kubenswrapper[4749]: I0320 07:42:00.178356 4749 scope.go:117] "RemoveContainer" containerID="1cd50d58afca40cad64e7875955d25b8686cb618117f93d78af3df8989a4b1c2" Mar 20 07:42:00 crc kubenswrapper[4749]: E0320 07:42:00.178799 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:42:00 crc kubenswrapper[4749]: I0320 07:42:00.179640 4749 scope.go:117] "RemoveContainer" containerID="4d8e78efd9340edd6f9a69c0288b8e640be4f341bfa8796261268cc4055c4563" Mar 20 07:42:00 crc kubenswrapper[4749]: E0320 07:42:00.179958 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 07:42:00 crc kubenswrapper[4749]: I0320 07:42:00.226414 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pkw2\" (UniqueName: \"kubernetes.io/projected/09126e47-94db-4c40-bc68-23606616605d-kube-api-access-4pkw2\") pod \"auto-csr-approver-29566542-lps6m\" (UID: \"09126e47-94db-4c40-bc68-23606616605d\") " pod="openshift-infra/auto-csr-approver-29566542-lps6m" Mar 20 07:42:00 crc kubenswrapper[4749]: I0320 07:42:00.328818 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pkw2\" (UniqueName: \"kubernetes.io/projected/09126e47-94db-4c40-bc68-23606616605d-kube-api-access-4pkw2\") pod \"auto-csr-approver-29566542-lps6m\" (UID: \"09126e47-94db-4c40-bc68-23606616605d\") " pod="openshift-infra/auto-csr-approver-29566542-lps6m" Mar 20 07:42:00 crc kubenswrapper[4749]: I0320 07:42:00.351164 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pkw2\" (UniqueName: \"kubernetes.io/projected/09126e47-94db-4c40-bc68-23606616605d-kube-api-access-4pkw2\") pod \"auto-csr-approver-29566542-lps6m\" (UID: \"09126e47-94db-4c40-bc68-23606616605d\") " pod="openshift-infra/auto-csr-approver-29566542-lps6m" Mar 20 07:42:00 crc kubenswrapper[4749]: I0320 07:42:00.501541 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566542-lps6m" Mar 20 07:42:00 crc kubenswrapper[4749]: I0320 07:42:00.941781 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566542-lps6m"] Mar 20 07:42:01 crc kubenswrapper[4749]: I0320 07:42:01.191447 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566542-lps6m" event={"ID":"09126e47-94db-4c40-bc68-23606616605d","Type":"ContainerStarted","Data":"0c4ec513ebaec0e0e07b05ea89dadbf27e8c29152f2a4478f9f75cdcffe58fd0"} Mar 20 07:42:02 crc kubenswrapper[4749]: I0320 07:42:02.204876 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566542-lps6m" event={"ID":"09126e47-94db-4c40-bc68-23606616605d","Type":"ContainerStarted","Data":"533bc999252f7f2716fcb93fcc15d78088e935ee8e2798a10a088a312763c003"} Mar 20 07:42:02 crc kubenswrapper[4749]: I0320 07:42:02.225049 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566542-lps6m" podStartSLOduration=1.413241934 podStartE2EDuration="2.225024557s" podCreationTimestamp="2026-03-20 07:42:00 +0000 UTC" firstStartedPulling="2026-03-20 07:42:00.944431929 +0000 UTC m=+1757.494089586" lastFinishedPulling="2026-03-20 07:42:01.756214522 +0000 UTC m=+1758.305872209" observedRunningTime="2026-03-20 07:42:02.223446648 +0000 UTC m=+1758.773104335" watchObservedRunningTime="2026-03-20 07:42:02.225024557 +0000 UTC m=+1758.774682234" Mar 20 07:42:03 crc kubenswrapper[4749]: I0320 07:42:03.217094 4749 generic.go:334] "Generic (PLEG): container finished" podID="09126e47-94db-4c40-bc68-23606616605d" containerID="533bc999252f7f2716fcb93fcc15d78088e935ee8e2798a10a088a312763c003" exitCode=0 Mar 20 07:42:03 crc kubenswrapper[4749]: I0320 07:42:03.217165 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566542-lps6m" event={"ID":"09126e47-94db-4c40-bc68-23606616605d","Type":"ContainerDied","Data":"533bc999252f7f2716fcb93fcc15d78088e935ee8e2798a10a088a312763c003"} Mar 20 07:42:04 crc kubenswrapper[4749]: I0320 07:42:04.631200 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566542-lps6m" Mar 20 07:42:04 crc kubenswrapper[4749]: I0320 07:42:04.711022 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pkw2\" (UniqueName: \"kubernetes.io/projected/09126e47-94db-4c40-bc68-23606616605d-kube-api-access-4pkw2\") pod \"09126e47-94db-4c40-bc68-23606616605d\" (UID: \"09126e47-94db-4c40-bc68-23606616605d\") " Mar 20 07:42:04 crc kubenswrapper[4749]: I0320 07:42:04.721585 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09126e47-94db-4c40-bc68-23606616605d-kube-api-access-4pkw2" (OuterVolumeSpecName: "kube-api-access-4pkw2") pod "09126e47-94db-4c40-bc68-23606616605d" (UID: "09126e47-94db-4c40-bc68-23606616605d"). InnerVolumeSpecName "kube-api-access-4pkw2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:42:04 crc kubenswrapper[4749]: I0320 07:42:04.813012 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pkw2\" (UniqueName: \"kubernetes.io/projected/09126e47-94db-4c40-bc68-23606616605d-kube-api-access-4pkw2\") on node \"crc\" DevicePath \"\"" Mar 20 07:42:05 crc kubenswrapper[4749]: I0320 07:42:05.236582 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566542-lps6m" event={"ID":"09126e47-94db-4c40-bc68-23606616605d","Type":"ContainerDied","Data":"0c4ec513ebaec0e0e07b05ea89dadbf27e8c29152f2a4478f9f75cdcffe58fd0"} Mar 20 07:42:05 crc kubenswrapper[4749]: I0320 07:42:05.236657 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c4ec513ebaec0e0e07b05ea89dadbf27e8c29152f2a4478f9f75cdcffe58fd0" Mar 20 07:42:05 crc kubenswrapper[4749]: I0320 07:42:05.236793 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566542-lps6m" Mar 20 07:42:05 crc kubenswrapper[4749]: I0320 07:42:05.294597 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566536-v9tvd"] Mar 20 07:42:05 crc kubenswrapper[4749]: I0320 07:42:05.300797 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566536-v9tvd"] Mar 20 07:42:06 crc kubenswrapper[4749]: I0320 07:42:06.178769 4749 scope.go:117] "RemoveContainer" containerID="17c6354fdb5bb7b31802e55acc92a8922ba9fa7a3272f8ce11243a6cbe9be4fa" Mar 20 07:42:06 crc kubenswrapper[4749]: E0320 07:42:06.184536 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:42:06 crc kubenswrapper[4749]: I0320 07:42:06.201015 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0ec3a89-57c0-46eb-b6fe-64cef0b74782" path="/var/lib/kubelet/pods/c0ec3a89-57c0-46eb-b6fe-64cef0b74782/volumes" Mar 20 07:42:10 crc kubenswrapper[4749]: I0320 07:42:10.050707 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-94fb-account-create-update-qpb8t"] Mar 20 07:42:10 crc kubenswrapper[4749]: I0320 07:42:10.071738 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-94fb-account-create-update-qpb8t"] Mar 20 07:42:10 crc kubenswrapper[4749]: I0320 07:42:10.082032 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-b25f-account-create-update-4prvn"] Mar 20 07:42:10 crc kubenswrapper[4749]: I0320 07:42:10.091429 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-b25f-account-create-update-4prvn"] Mar 20 07:42:10 crc kubenswrapper[4749]: I0320 07:42:10.100485 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-6cllh"] Mar 20 07:42:10 crc kubenswrapper[4749]: I0320 07:42:10.106778 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-563c-account-create-update-xjv9j"] Mar 20 07:42:10 crc kubenswrapper[4749]: I0320 07:42:10.112774 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-563c-account-create-update-xjv9j"] Mar 20 07:42:10 crc kubenswrapper[4749]: I0320 07:42:10.118112 4749 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-6cllh"] Mar 20 07:42:10 crc kubenswrapper[4749]: I0320 07:42:10.124070 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-vvwnz"] Mar 20 07:42:10 crc kubenswrapper[4749]: I0320 07:42:10.129668 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-vvwnz"] Mar 20 07:42:10 crc kubenswrapper[4749]: I0320 07:42:10.135247 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-ds4q2"] Mar 20 07:42:10 crc kubenswrapper[4749]: I0320 07:42:10.140244 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-ds4q2"] Mar 20 07:42:10 crc kubenswrapper[4749]: I0320 07:42:10.187940 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dae8082-8d1a-448c-961f-bf0c58f0bd81" path="/var/lib/kubelet/pods/5dae8082-8d1a-448c-961f-bf0c58f0bd81/volumes" Mar 20 07:42:10 crc kubenswrapper[4749]: I0320 07:42:10.188914 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7811a5b-1577-4ecc-b54f-949bc39b0289" path="/var/lib/kubelet/pods/d7811a5b-1577-4ecc-b54f-949bc39b0289/volumes" Mar 20 07:42:10 crc kubenswrapper[4749]: I0320 07:42:10.189499 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db14fc2d-a32e-49a6-8d9c-9c6fd3e447f5" path="/var/lib/kubelet/pods/db14fc2d-a32e-49a6-8d9c-9c6fd3e447f5/volumes" Mar 20 07:42:10 crc kubenswrapper[4749]: I0320 07:42:10.190064 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9ddc4f6-6f60-489c-bedb-44a31de6894e" path="/var/lib/kubelet/pods/e9ddc4f6-6f60-489c-bedb-44a31de6894e/volumes" Mar 20 07:42:10 crc kubenswrapper[4749]: I0320 07:42:10.191218 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f48a0db5-3834-46dd-a959-a6a4e67fc1dd" path="/var/lib/kubelet/pods/f48a0db5-3834-46dd-a959-a6a4e67fc1dd/volumes" Mar 20 07:42:10 crc kubenswrapper[4749]: I0320 07:42:10.191812 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f61da3ae-a72f-4d88-b8cc-38d0503649d8" path="/var/lib/kubelet/pods/f61da3ae-a72f-4d88-b8cc-38d0503649d8/volumes" Mar 20 07:42:12 crc kubenswrapper[4749]: I0320 07:42:12.177199 4749 scope.go:117] "RemoveContainer" containerID="4d8e78efd9340edd6f9a69c0288b8e640be4f341bfa8796261268cc4055c4563" Mar 20 07:42:12 crc kubenswrapper[4749]: E0320 07:42:12.178043 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 07:42:15 crc kubenswrapper[4749]: I0320 07:42:15.067738 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-fvvnb"] Mar 20 07:42:15 crc kubenswrapper[4749]: I0320 07:42:15.077338 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-fvvnb"] Mar 20 07:42:15 crc kubenswrapper[4749]: I0320 07:42:15.177405 4749 scope.go:117] "RemoveContainer" containerID="1cd50d58afca40cad64e7875955d25b8686cb618117f93d78af3df8989a4b1c2" Mar 20 07:42:15 crc kubenswrapper[4749]: E0320 07:42:15.177741 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:42:16 crc kubenswrapper[4749]: I0320 07:42:16.185818 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cccea11-5b61-437f-bddb-888f138a1d3f" path="/var/lib/kubelet/pods/1cccea11-5b61-437f-bddb-888f138a1d3f/volumes" Mar 20 07:42:21 crc kubenswrapper[4749]: I0320 07:42:21.177701 4749 scope.go:117] "RemoveContainer" containerID="17c6354fdb5bb7b31802e55acc92a8922ba9fa7a3272f8ce11243a6cbe9be4fa" Mar 20 07:42:21 crc kubenswrapper[4749]: E0320 07:42:21.178504 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:42:25 crc kubenswrapper[4749]: I0320 07:42:25.176854 4749 scope.go:117] "RemoveContainer" containerID="4d8e78efd9340edd6f9a69c0288b8e640be4f341bfa8796261268cc4055c4563" Mar 20 07:42:25 crc kubenswrapper[4749]: E0320 07:42:25.177437 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 07:42:27 crc kubenswrapper[4749]: I0320 07:42:27.177444 4749 scope.go:117] "RemoveContainer" containerID="1cd50d58afca40cad64e7875955d25b8686cb618117f93d78af3df8989a4b1c2" Mar 20 07:42:27 crc kubenswrapper[4749]: E0320 07:42:27.178087 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:42:35 crc kubenswrapper[4749]: I0320 07:42:35.036618 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-qm6tz"] Mar 20 07:42:35 crc kubenswrapper[4749]: I0320 07:42:35.049079 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-qm6tz"] Mar 20 07:42:36 crc kubenswrapper[4749]: I0320 07:42:36.177924 4749 scope.go:117] "RemoveContainer" containerID="17c6354fdb5bb7b31802e55acc92a8922ba9fa7a3272f8ce11243a6cbe9be4fa" Mar 20 07:42:36 crc kubenswrapper[4749]: E0320 07:42:36.178303 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:42:36 crc kubenswrapper[4749]: I0320 07:42:36.193581 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ae472d6-49bf-44a4-85a3-30e1dd169d3a" path="/var/lib/kubelet/pods/7ae472d6-49bf-44a4-85a3-30e1dd169d3a/volumes" Mar 20 07:42:37 crc kubenswrapper[4749]: I0320 
07:42:37.177408 4749 scope.go:117] "RemoveContainer" containerID="4d8e78efd9340edd6f9a69c0288b8e640be4f341bfa8796261268cc4055c4563" Mar 20 07:42:37 crc kubenswrapper[4749]: E0320 07:42:37.177904 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 07:42:39 crc kubenswrapper[4749]: I0320 07:42:39.178115 4749 scope.go:117] "RemoveContainer" containerID="1cd50d58afca40cad64e7875955d25b8686cb618117f93d78af3df8989a4b1c2" Mar 20 07:42:39 crc kubenswrapper[4749]: E0320 07:42:39.178882 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:42:49 crc kubenswrapper[4749]: I0320 07:42:49.177604 4749 scope.go:117] "RemoveContainer" containerID="4d8e78efd9340edd6f9a69c0288b8e640be4f341bfa8796261268cc4055c4563" Mar 20 07:42:49 crc kubenswrapper[4749]: E0320 07:42:49.179406 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 07:42:50 crc kubenswrapper[4749]: I0320 07:42:50.177752 4749 scope.go:117] "RemoveContainer" containerID="17c6354fdb5bb7b31802e55acc92a8922ba9fa7a3272f8ce11243a6cbe9be4fa" Mar 20 07:42:50 crc kubenswrapper[4749]: E0320 07:42:50.178208 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:42:50 crc kubenswrapper[4749]: I0320 07:42:50.179274 4749 scope.go:117] "RemoveContainer" containerID="1cd50d58afca40cad64e7875955d25b8686cb618117f93d78af3df8989a4b1c2" Mar 20 07:42:50 crc kubenswrapper[4749]: E0320 07:42:50.179762 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:42:55 crc kubenswrapper[4749]: I0320 07:42:55.114589 4749 scope.go:117] "RemoveContainer" containerID="03274338217e796f9048d6fd52cdf22751bb228c4e948da0fefd07fe76d1a02b" Mar 20 07:42:55 crc kubenswrapper[4749]: I0320 07:42:55.187329 4749 scope.go:117] "RemoveContainer" containerID="ffdce9f2c2b0948fa7c96d51232cc816332f88386d3cfa8e8e6a1162fd250742" Mar 20 07:42:55 crc kubenswrapper[4749]: I0320 07:42:55.226695 4749 scope.go:117] "RemoveContainer" 
containerID="ff4e7b076ebfd76e033599e596be9afea75dd61efe5c4cfab47bde686b3cad6a" Mar 20 07:42:55 crc kubenswrapper[4749]: I0320 07:42:55.256076 4749 scope.go:117] "RemoveContainer" containerID="cb8d13cd73d35aab6a3382d05cc06e4ed46c7b05ec3a43313054396575f48876" Mar 20 07:42:55 crc kubenswrapper[4749]: I0320 07:42:55.280271 4749 scope.go:117] "RemoveContainer" containerID="80d77ee4a1bf66d80163f8a5526304fdccc195168842114249293e02583e9d40" Mar 20 07:42:55 crc kubenswrapper[4749]: I0320 07:42:55.322768 4749 scope.go:117] "RemoveContainer" containerID="49b4f5449d47da38b079ed5043f4527a09a72ed466b1188f24387c92f34f2255" Mar 20 07:42:55 crc kubenswrapper[4749]: I0320 07:42:55.366528 4749 scope.go:117] "RemoveContainer" containerID="d717c0878684addb787c2829681853ed2653c938c3158afe5b2647421d5e6044" Mar 20 07:42:55 crc kubenswrapper[4749]: I0320 07:42:55.386932 4749 scope.go:117] "RemoveContainer" containerID="b1cc3cc1324d1ac28122d37be91909fea04a11b4e88444c7ab5db72214f77e7e" Mar 20 07:42:55 crc kubenswrapper[4749]: I0320 07:42:55.441576 4749 scope.go:117] "RemoveContainer" containerID="4ad168f5534165dde867123251fc544a3c2c5edd7514e8ed47cd8f3022b4a2a9" Mar 20 07:43:02 crc kubenswrapper[4749]: I0320 07:43:02.178167 4749 scope.go:117] "RemoveContainer" containerID="1cd50d58afca40cad64e7875955d25b8686cb618117f93d78af3df8989a4b1c2" Mar 20 07:43:02 crc kubenswrapper[4749]: E0320 07:43:02.179441 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:43:03 crc kubenswrapper[4749]: I0320 07:43:03.177806 4749 scope.go:117] "RemoveContainer" containerID="4d8e78efd9340edd6f9a69c0288b8e640be4f341bfa8796261268cc4055c4563" Mar 20 07:43:03 crc kubenswrapper[4749]: E0320 07:43:03.178003 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 07:43:04 crc kubenswrapper[4749]: I0320 07:43:04.178060 4749 scope.go:117] "RemoveContainer" containerID="17c6354fdb5bb7b31802e55acc92a8922ba9fa7a3272f8ce11243a6cbe9be4fa" Mar 20 07:43:04 crc kubenswrapper[4749]: E0320 07:43:04.178647 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:43:15 crc kubenswrapper[4749]: I0320 07:43:15.177779 4749 scope.go:117] "RemoveContainer" containerID="4d8e78efd9340edd6f9a69c0288b8e640be4f341bfa8796261268cc4055c4563" Mar 20 07:43:15 crc kubenswrapper[4749]: I0320 07:43:15.178277 4749 scope.go:117] "RemoveContainer" containerID="1cd50d58afca40cad64e7875955d25b8686cb618117f93d78af3df8989a4b1c2" Mar 20 07:43:15 crc kubenswrapper[4749]: E0320 07:43:15.178346 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 07:43:15 crc kubenswrapper[4749]: E0320 07:43:15.178682 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:43:16 crc kubenswrapper[4749]: I0320 07:43:16.178722 4749 scope.go:117] "RemoveContainer" containerID="17c6354fdb5bb7b31802e55acc92a8922ba9fa7a3272f8ce11243a6cbe9be4fa" Mar 20 07:43:16 crc kubenswrapper[4749]: E0320 07:43:16.179079 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:43:26 crc kubenswrapper[4749]: I0320 07:43:26.177592 4749 scope.go:117] "RemoveContainer" containerID="4d8e78efd9340edd6f9a69c0288b8e640be4f341bfa8796261268cc4055c4563" Mar 20 07:43:26 crc kubenswrapper[4749]: E0320 07:43:26.178610 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 07:43:30 crc kubenswrapper[4749]: I0320 07:43:30.178580 4749 scope.go:117] "RemoveContainer" containerID="1cd50d58afca40cad64e7875955d25b8686cb618117f93d78af3df8989a4b1c2" Mar 20 07:43:30 crc kubenswrapper[4749]: I0320 07:43:30.179351 4749 scope.go:117] "RemoveContainer" containerID="17c6354fdb5bb7b31802e55acc92a8922ba9fa7a3272f8ce11243a6cbe9be4fa" Mar 20 07:43:31 crc kubenswrapper[4749]: I0320 07:43:31.111737 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8b9b402f-2d95-48f5-98d8-497d90956ba2","Type":"ContainerStarted","Data":"0c0b95b9a274418bdf2e130672c84aaa5cd8171d0fefd670ec6a651638e9d158"} Mar 20 07:43:31 crc kubenswrapper[4749]: I0320 07:43:31.112413 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:43:31 crc kubenswrapper[4749]: I0320 07:43:31.114927 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8db06e36-0b00-4157-9345-69449da3e85f","Type":"ContainerStarted","Data":"630f08807190c10ee7819d4aaeefeeb8553739d4a57777808e3b8461de8a1e8c"} Mar 20 07:43:31 crc kubenswrapper[4749]: I0320 07:43:31.115248 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 20 07:43:35 crc kubenswrapper[4749]: I0320 07:43:35.158206 4749 generic.go:334] "Generic (PLEG): container finished" podID="8b9b402f-2d95-48f5-98d8-497d90956ba2" containerID="0c0b95b9a274418bdf2e130672c84aaa5cd8171d0fefd670ec6a651638e9d158" exitCode=0 Mar 20 07:43:35 crc 
kubenswrapper[4749]: I0320 07:43:35.158654 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8b9b402f-2d95-48f5-98d8-497d90956ba2","Type":"ContainerDied","Data":"0c0b95b9a274418bdf2e130672c84aaa5cd8171d0fefd670ec6a651638e9d158"} Mar 20 07:43:35 crc kubenswrapper[4749]: I0320 07:43:35.158703 4749 scope.go:117] "RemoveContainer" containerID="1cd50d58afca40cad64e7875955d25b8686cb618117f93d78af3df8989a4b1c2" Mar 20 07:43:35 crc kubenswrapper[4749]: I0320 07:43:35.159691 4749 scope.go:117] "RemoveContainer" containerID="0c0b95b9a274418bdf2e130672c84aaa5cd8171d0fefd670ec6a651638e9d158" Mar 20 07:43:35 crc kubenswrapper[4749]: E0320 07:43:35.160115 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:43:35 crc kubenswrapper[4749]: I0320 07:43:35.165481 4749 generic.go:334] "Generic (PLEG): container finished" podID="8db06e36-0b00-4157-9345-69449da3e85f" containerID="630f08807190c10ee7819d4aaeefeeb8553739d4a57777808e3b8461de8a1e8c" exitCode=0 Mar 20 07:43:35 crc kubenswrapper[4749]: I0320 07:43:35.165533 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8db06e36-0b00-4157-9345-69449da3e85f","Type":"ContainerDied","Data":"630f08807190c10ee7819d4aaeefeeb8553739d4a57777808e3b8461de8a1e8c"} Mar 20 07:43:35 crc kubenswrapper[4749]: I0320 07:43:35.166183 4749 scope.go:117] "RemoveContainer" containerID="630f08807190c10ee7819d4aaeefeeb8553739d4a57777808e3b8461de8a1e8c" Mar 20 07:43:35 crc kubenswrapper[4749]: E0320 07:43:35.166575 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:43:35 crc kubenswrapper[4749]: I0320 07:43:35.226509 4749 scope.go:117] "RemoveContainer" containerID="17c6354fdb5bb7b31802e55acc92a8922ba9fa7a3272f8ce11243a6cbe9be4fa" Mar 20 07:43:41 crc kubenswrapper[4749]: I0320 07:43:41.179460 4749 scope.go:117] "RemoveContainer" containerID="4d8e78efd9340edd6f9a69c0288b8e640be4f341bfa8796261268cc4055c4563" Mar 20 07:43:42 crc kubenswrapper[4749]: I0320 07:43:42.246683 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" event={"ID":"12151228-1cb9-4086-9a62-f4a9583f5f69","Type":"ContainerStarted","Data":"f0b6b46505f9df084ed8de0c0f1cf3091e394d293032dd62e13935f99ca383ee"} Mar 20 07:43:47 crc kubenswrapper[4749]: I0320 07:43:47.177702 4749 scope.go:117] "RemoveContainer" containerID="0c0b95b9a274418bdf2e130672c84aaa5cd8171d0fefd670ec6a651638e9d158" Mar 20 07:43:47 crc kubenswrapper[4749]: E0320 07:43:47.178834 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:43:48 crc kubenswrapper[4749]: I0320 07:43:48.178846 4749 
scope.go:117] "RemoveContainer" containerID="630f08807190c10ee7819d4aaeefeeb8553739d4a57777808e3b8461de8a1e8c" Mar 20 07:43:48 crc kubenswrapper[4749]: E0320 07:43:48.179683 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:44:00 crc kubenswrapper[4749]: I0320 07:44:00.148192 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566544-jkm2b"] Mar 20 07:44:00 crc kubenswrapper[4749]: E0320 07:44:00.149077 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09126e47-94db-4c40-bc68-23606616605d" containerName="oc" Mar 20 07:44:00 crc kubenswrapper[4749]: I0320 07:44:00.149091 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="09126e47-94db-4c40-bc68-23606616605d" containerName="oc" Mar 20 07:44:00 crc kubenswrapper[4749]: I0320 07:44:00.149272 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="09126e47-94db-4c40-bc68-23606616605d" containerName="oc" Mar 20 07:44:00 crc kubenswrapper[4749]: I0320 07:44:00.149758 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566544-jkm2b" Mar 20 07:44:00 crc kubenswrapper[4749]: I0320 07:44:00.153352 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6rhdf" Mar 20 07:44:00 crc kubenswrapper[4749]: I0320 07:44:00.153720 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 07:44:00 crc kubenswrapper[4749]: I0320 07:44:00.159956 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 07:44:00 crc kubenswrapper[4749]: I0320 07:44:00.160804 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566544-jkm2b"] Mar 20 07:44:00 crc kubenswrapper[4749]: I0320 07:44:00.178442 4749 scope.go:117] "RemoveContainer" containerID="630f08807190c10ee7819d4aaeefeeb8553739d4a57777808e3b8461de8a1e8c" Mar 20 07:44:00 crc kubenswrapper[4749]: E0320 07:44:00.178738 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:44:00 crc kubenswrapper[4749]: I0320 07:44:00.316907 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qfrt\" (UniqueName: \"kubernetes.io/projected/9a388828-add8-4a36-a802-7a12bc486545-kube-api-access-9qfrt\") pod \"auto-csr-approver-29566544-jkm2b\" (UID: \"9a388828-add8-4a36-a802-7a12bc486545\") " pod="openshift-infra/auto-csr-approver-29566544-jkm2b" Mar 20 07:44:00 crc kubenswrapper[4749]: I0320 07:44:00.419980 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qfrt\" (UniqueName: \"kubernetes.io/projected/9a388828-add8-4a36-a802-7a12bc486545-kube-api-access-9qfrt\") pod \"auto-csr-approver-29566544-jkm2b\" (UID: \"9a388828-add8-4a36-a802-7a12bc486545\") " 
pod="openshift-infra/auto-csr-approver-29566544-jkm2b" Mar 20 07:44:00 crc kubenswrapper[4749]: I0320 07:44:00.445479 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qfrt\" (UniqueName: \"kubernetes.io/projected/9a388828-add8-4a36-a802-7a12bc486545-kube-api-access-9qfrt\") pod \"auto-csr-approver-29566544-jkm2b\" (UID: \"9a388828-add8-4a36-a802-7a12bc486545\") " pod="openshift-infra/auto-csr-approver-29566544-jkm2b" Mar 20 07:44:00 crc kubenswrapper[4749]: I0320 07:44:00.469239 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566544-jkm2b" Mar 20 07:44:00 crc kubenswrapper[4749]: I0320 07:44:00.925110 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566544-jkm2b"] Mar 20 07:44:00 crc kubenswrapper[4749]: I0320 07:44:00.929678 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 07:44:01 crc kubenswrapper[4749]: I0320 07:44:01.429610 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566544-jkm2b" event={"ID":"9a388828-add8-4a36-a802-7a12bc486545","Type":"ContainerStarted","Data":"931df4ff25be64deabcb39df7f6c4c03289fed7e8a14eac9d436a561d486b4f2"} Mar 20 07:44:02 crc kubenswrapper[4749]: I0320 07:44:02.177607 4749 scope.go:117] "RemoveContainer" containerID="0c0b95b9a274418bdf2e130672c84aaa5cd8171d0fefd670ec6a651638e9d158" Mar 20 07:44:02 crc kubenswrapper[4749]: E0320 07:44:02.178480 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:44:02 crc kubenswrapper[4749]: I0320 07:44:02.439014 4749 generic.go:334] "Generic (PLEG): container finished" podID="9a388828-add8-4a36-a802-7a12bc486545" containerID="e5be83a5fc8f67aabb41d6f791b86d718017f21ece3e7b2b6ee9aaaf48f64558" exitCode=0 Mar 20 07:44:02 crc kubenswrapper[4749]: I0320 07:44:02.439061 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566544-jkm2b" event={"ID":"9a388828-add8-4a36-a802-7a12bc486545","Type":"ContainerDied","Data":"e5be83a5fc8f67aabb41d6f791b86d718017f21ece3e7b2b6ee9aaaf48f64558"} Mar 20 07:44:03 crc kubenswrapper[4749]: I0320 07:44:03.751048 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566544-jkm2b" Mar 20 07:44:03 crc kubenswrapper[4749]: I0320 07:44:03.875009 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qfrt\" (UniqueName: \"kubernetes.io/projected/9a388828-add8-4a36-a802-7a12bc486545-kube-api-access-9qfrt\") pod \"9a388828-add8-4a36-a802-7a12bc486545\" (UID: \"9a388828-add8-4a36-a802-7a12bc486545\") " Mar 20 07:44:03 crc kubenswrapper[4749]: I0320 07:44:03.886650 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a388828-add8-4a36-a802-7a12bc486545-kube-api-access-9qfrt" (OuterVolumeSpecName: "kube-api-access-9qfrt") pod "9a388828-add8-4a36-a802-7a12bc486545" (UID: "9a388828-add8-4a36-a802-7a12bc486545"). InnerVolumeSpecName "kube-api-access-9qfrt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:44:03 crc kubenswrapper[4749]: I0320 07:44:03.977158 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qfrt\" (UniqueName: \"kubernetes.io/projected/9a388828-add8-4a36-a802-7a12bc486545-kube-api-access-9qfrt\") on node \"crc\" DevicePath \"\"" Mar 20 07:44:04 crc kubenswrapper[4749]: I0320 07:44:04.460248 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566544-jkm2b" event={"ID":"9a388828-add8-4a36-a802-7a12bc486545","Type":"ContainerDied","Data":"931df4ff25be64deabcb39df7f6c4c03289fed7e8a14eac9d436a561d486b4f2"} Mar 20 07:44:04 crc kubenswrapper[4749]: I0320 07:44:04.460302 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="931df4ff25be64deabcb39df7f6c4c03289fed7e8a14eac9d436a561d486b4f2" Mar 20 07:44:04 crc kubenswrapper[4749]: I0320 07:44:04.460363 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566544-jkm2b" Mar 20 07:44:04 crc kubenswrapper[4749]: I0320 07:44:04.827207 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566538-g7qxx"] Mar 20 07:44:04 crc kubenswrapper[4749]: I0320 07:44:04.834463 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566538-g7qxx"] Mar 20 07:44:06 crc kubenswrapper[4749]: I0320 07:44:06.190146 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8804ec1-0bcd-42e0-bd6c-63af36a81efb" path="/var/lib/kubelet/pods/d8804ec1-0bcd-42e0-bd6c-63af36a81efb/volumes" Mar 20 07:44:12 crc kubenswrapper[4749]: I0320 07:44:12.178408 4749 scope.go:117] "RemoveContainer" containerID="630f08807190c10ee7819d4aaeefeeb8553739d4a57777808e3b8461de8a1e8c" Mar 20 07:44:12 crc kubenswrapper[4749]: E0320 07:44:12.181798 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:44:17 crc kubenswrapper[4749]: I0320 07:44:17.177793 4749 scope.go:117] "RemoveContainer" containerID="0c0b95b9a274418bdf2e130672c84aaa5cd8171d0fefd670ec6a651638e9d158" Mar 20 07:44:17 crc kubenswrapper[4749]: E0320 07:44:17.178520 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:44:25 crc kubenswrapper[4749]: I0320 07:44:25.177325 4749 scope.go:117] "RemoveContainer" containerID="630f08807190c10ee7819d4aaeefeeb8553739d4a57777808e3b8461de8a1e8c" Mar 20 07:44:25 crc kubenswrapper[4749]: E0320 07:44:25.178287 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:44:28 crc kubenswrapper[4749]: I0320 07:44:28.177555 4749 scope.go:117] "RemoveContainer" 
containerID="0c0b95b9a274418bdf2e130672c84aaa5cd8171d0fefd670ec6a651638e9d158" Mar 20 07:44:28 crc kubenswrapper[4749]: E0320 07:44:28.178125 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:44:39 crc kubenswrapper[4749]: I0320 07:44:39.177759 4749 scope.go:117] "RemoveContainer" containerID="0c0b95b9a274418bdf2e130672c84aaa5cd8171d0fefd670ec6a651638e9d158" Mar 20 07:44:39 crc kubenswrapper[4749]: E0320 07:44:39.178583 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:44:40 crc kubenswrapper[4749]: I0320 07:44:40.176938 4749 scope.go:117] "RemoveContainer" containerID="630f08807190c10ee7819d4aaeefeeb8553739d4a57777808e3b8461de8a1e8c" Mar 20 07:44:40 crc kubenswrapper[4749]: E0320 07:44:40.177688 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:44:52 crc kubenswrapper[4749]: I0320 07:44:52.177357 4749 scope.go:117] "RemoveContainer" containerID="0c0b95b9a274418bdf2e130672c84aaa5cd8171d0fefd670ec6a651638e9d158" Mar 20 07:44:52 crc kubenswrapper[4749]: E0320 07:44:52.178490 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:44:55 crc kubenswrapper[4749]: I0320 07:44:55.177694 4749 scope.go:117] "RemoveContainer" containerID="630f08807190c10ee7819d4aaeefeeb8553739d4a57777808e3b8461de8a1e8c" Mar 20 07:44:55 crc kubenswrapper[4749]: E0320 07:44:55.178349 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:44:55 crc kubenswrapper[4749]: I0320 07:44:55.588069 4749 scope.go:117] "RemoveContainer" containerID="d68b5348b6e53fa83b8c33c432c700631f999ced8e97b2c3e1f926f08ed5a9f7" Mar 20 07:45:00 crc kubenswrapper[4749]: I0320 07:45:00.153646 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566545-nq9t6"] Mar 20 07:45:00 crc kubenswrapper[4749]: E0320 07:45:00.154491 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a388828-add8-4a36-a802-7a12bc486545" containerName="oc" Mar 20 07:45:00 crc kubenswrapper[4749]: I0320 07:45:00.154504 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a388828-add8-4a36-a802-7a12bc486545" 
containerName="oc" Mar 20 07:45:00 crc kubenswrapper[4749]: I0320 07:45:00.154695 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a388828-add8-4a36-a802-7a12bc486545" containerName="oc" Mar 20 07:45:00 crc kubenswrapper[4749]: I0320 07:45:00.155174 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566545-nq9t6" Mar 20 07:45:00 crc kubenswrapper[4749]: I0320 07:45:00.157670 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 07:45:00 crc kubenswrapper[4749]: I0320 07:45:00.157815 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 07:45:00 crc kubenswrapper[4749]: I0320 07:45:00.189095 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566545-nq9t6"] Mar 20 07:45:00 crc kubenswrapper[4749]: I0320 07:45:00.307921 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/edd035e5-d0a2-41ee-a0ae-e0909392a9d4-secret-volume\") pod \"collect-profiles-29566545-nq9t6\" (UID: \"edd035e5-d0a2-41ee-a0ae-e0909392a9d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566545-nq9t6" Mar 20 07:45:00 crc kubenswrapper[4749]: I0320 07:45:00.308448 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/edd035e5-d0a2-41ee-a0ae-e0909392a9d4-config-volume\") pod \"collect-profiles-29566545-nq9t6\" (UID: \"edd035e5-d0a2-41ee-a0ae-e0909392a9d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566545-nq9t6" Mar 20 07:45:00 crc kubenswrapper[4749]: I0320 07:45:00.308529 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcrfl\" (UniqueName: \"kubernetes.io/projected/edd035e5-d0a2-41ee-a0ae-e0909392a9d4-kube-api-access-hcrfl\") pod \"collect-profiles-29566545-nq9t6\" (UID: \"edd035e5-d0a2-41ee-a0ae-e0909392a9d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566545-nq9t6" Mar 20 07:45:00 crc kubenswrapper[4749]: I0320 07:45:00.410344 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/edd035e5-d0a2-41ee-a0ae-e0909392a9d4-secret-volume\") pod \"collect-profiles-29566545-nq9t6\" (UID: \"edd035e5-d0a2-41ee-a0ae-e0909392a9d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566545-nq9t6" Mar 20 07:45:00 crc kubenswrapper[4749]: I0320 07:45:00.410422 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/edd035e5-d0a2-41ee-a0ae-e0909392a9d4-config-volume\") pod \"collect-profiles-29566545-nq9t6\" (UID: \"edd035e5-d0a2-41ee-a0ae-e0909392a9d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566545-nq9t6" Mar 20 07:45:00 crc kubenswrapper[4749]: I0320 07:45:00.410460 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcrfl\" (UniqueName: \"kubernetes.io/projected/edd035e5-d0a2-41ee-a0ae-e0909392a9d4-kube-api-access-hcrfl\") pod \"collect-profiles-29566545-nq9t6\" (UID: \"edd035e5-d0a2-41ee-a0ae-e0909392a9d4\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29566545-nq9t6" Mar 20 07:45:00 crc kubenswrapper[4749]: I0320 07:45:00.412956 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/edd035e5-d0a2-41ee-a0ae-e0909392a9d4-config-volume\") pod \"collect-profiles-29566545-nq9t6\" (UID: \"edd035e5-d0a2-41ee-a0ae-e0909392a9d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566545-nq9t6" Mar 20 07:45:00 crc kubenswrapper[4749]: I0320 07:45:00.429561 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/edd035e5-d0a2-41ee-a0ae-e0909392a9d4-secret-volume\") pod \"collect-profiles-29566545-nq9t6\" (UID: \"edd035e5-d0a2-41ee-a0ae-e0909392a9d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566545-nq9t6" Mar 20 07:45:00 crc kubenswrapper[4749]: I0320 07:45:00.431534 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcrfl\" (UniqueName: \"kubernetes.io/projected/edd035e5-d0a2-41ee-a0ae-e0909392a9d4-kube-api-access-hcrfl\") pod \"collect-profiles-29566545-nq9t6\" (UID: \"edd035e5-d0a2-41ee-a0ae-e0909392a9d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566545-nq9t6" Mar 20 07:45:00 crc kubenswrapper[4749]: I0320 07:45:00.481728 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566545-nq9t6" Mar 20 07:45:00 crc kubenswrapper[4749]: I0320 07:45:00.994052 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566545-nq9t6"] Mar 20 07:45:01 crc kubenswrapper[4749]: I0320 07:45:01.993559 4749 generic.go:334] "Generic (PLEG): container finished" podID="edd035e5-d0a2-41ee-a0ae-e0909392a9d4" containerID="151fa2e172c83b4ed6a3e2fb83a15d280e71578463b076eee33fcfe8b817c36a" exitCode=0 Mar 20 07:45:01 crc kubenswrapper[4749]: I0320 07:45:01.993873 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566545-nq9t6" event={"ID":"edd035e5-d0a2-41ee-a0ae-e0909392a9d4","Type":"ContainerDied","Data":"151fa2e172c83b4ed6a3e2fb83a15d280e71578463b076eee33fcfe8b817c36a"} Mar 20 07:45:01 crc kubenswrapper[4749]: I0320 07:45:01.993913 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566545-nq9t6" event={"ID":"edd035e5-d0a2-41ee-a0ae-e0909392a9d4","Type":"ContainerStarted","Data":"8033dd62af9c698cb0ac948604071cfc990a6bc173fbaab9f67135f1490321cc"} Mar 20 07:45:03 crc kubenswrapper[4749]: I0320 07:45:03.387665 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566545-nq9t6" Mar 20 07:45:03 crc kubenswrapper[4749]: I0320 07:45:03.561867 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/edd035e5-d0a2-41ee-a0ae-e0909392a9d4-secret-volume\") pod \"edd035e5-d0a2-41ee-a0ae-e0909392a9d4\" (UID: \"edd035e5-d0a2-41ee-a0ae-e0909392a9d4\") " Mar 20 07:45:03 crc kubenswrapper[4749]: I0320 07:45:03.562067 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/edd035e5-d0a2-41ee-a0ae-e0909392a9d4-config-volume\") pod \"edd035e5-d0a2-41ee-a0ae-e0909392a9d4\" (UID: \"edd035e5-d0a2-41ee-a0ae-e0909392a9d4\") " Mar 20 07:45:03 crc kubenswrapper[4749]: I0320 07:45:03.562093 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcrfl\" (UniqueName: \"kubernetes.io/projected/edd035e5-d0a2-41ee-a0ae-e0909392a9d4-kube-api-access-hcrfl\") pod \"edd035e5-d0a2-41ee-a0ae-e0909392a9d4\" (UID: \"edd035e5-d0a2-41ee-a0ae-e0909392a9d4\") " Mar 20 07:45:03 crc kubenswrapper[4749]: I0320 07:45:03.562842 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edd035e5-d0a2-41ee-a0ae-e0909392a9d4-config-volume" (OuterVolumeSpecName: "config-volume") pod "edd035e5-d0a2-41ee-a0ae-e0909392a9d4" (UID: "edd035e5-d0a2-41ee-a0ae-e0909392a9d4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 07:45:03 crc kubenswrapper[4749]: I0320 07:45:03.568988 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edd035e5-d0a2-41ee-a0ae-e0909392a9d4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "edd035e5-d0a2-41ee-a0ae-e0909392a9d4" (UID: "edd035e5-d0a2-41ee-a0ae-e0909392a9d4"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 07:45:03 crc kubenswrapper[4749]: I0320 07:45:03.575449 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edd035e5-d0a2-41ee-a0ae-e0909392a9d4-kube-api-access-hcrfl" (OuterVolumeSpecName: "kube-api-access-hcrfl") pod "edd035e5-d0a2-41ee-a0ae-e0909392a9d4" (UID: "edd035e5-d0a2-41ee-a0ae-e0909392a9d4"). InnerVolumeSpecName "kube-api-access-hcrfl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:45:03 crc kubenswrapper[4749]: I0320 07:45:03.664633 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/edd035e5-d0a2-41ee-a0ae-e0909392a9d4-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 07:45:03 crc kubenswrapper[4749]: I0320 07:45:03.664687 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/edd035e5-d0a2-41ee-a0ae-e0909392a9d4-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 07:45:03 crc kubenswrapper[4749]: I0320 07:45:03.664708 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcrfl\" (UniqueName: \"kubernetes.io/projected/edd035e5-d0a2-41ee-a0ae-e0909392a9d4-kube-api-access-hcrfl\") on node \"crc\" DevicePath \"\"" Mar 20 07:45:04 crc kubenswrapper[4749]: I0320 07:45:04.010175 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566545-nq9t6" event={"ID":"edd035e5-d0a2-41ee-a0ae-e0909392a9d4","Type":"ContainerDied","Data":"8033dd62af9c698cb0ac948604071cfc990a6bc173fbaab9f67135f1490321cc"} Mar 20 07:45:04 crc kubenswrapper[4749]: I0320 07:45:04.010218 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8033dd62af9c698cb0ac948604071cfc990a6bc173fbaab9f67135f1490321cc" Mar 20 07:45:04 crc kubenswrapper[4749]: I0320 07:45:04.010273 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566545-nq9t6" Mar 20 07:45:05 crc kubenswrapper[4749]: I0320 07:45:05.177924 4749 scope.go:117] "RemoveContainer" containerID="0c0b95b9a274418bdf2e130672c84aaa5cd8171d0fefd670ec6a651638e9d158" Mar 20 07:45:05 crc kubenswrapper[4749]: E0320 07:45:05.178365 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:45:09 crc kubenswrapper[4749]: I0320 07:45:09.177785 4749 scope.go:117] "RemoveContainer" containerID="630f08807190c10ee7819d4aaeefeeb8553739d4a57777808e3b8461de8a1e8c" Mar 20 07:45:09 crc kubenswrapper[4749]: E0320 07:45:09.178525 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:45:16 crc kubenswrapper[4749]: I0320 07:45:16.178470 4749 scope.go:117] "RemoveContainer" containerID="0c0b95b9a274418bdf2e130672c84aaa5cd8171d0fefd670ec6a651638e9d158" Mar 20 07:45:16 crc kubenswrapper[4749]: E0320 07:45:16.179510 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:45:23 crc kubenswrapper[4749]: I0320 07:45:23.179454 4749 scope.go:117] "RemoveContainer" 
containerID="630f08807190c10ee7819d4aaeefeeb8553739d4a57777808e3b8461de8a1e8c" Mar 20 07:45:23 crc kubenswrapper[4749]: E0320 07:45:23.180799 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:45:28 crc kubenswrapper[4749]: I0320 07:45:28.177590 4749 scope.go:117] "RemoveContainer" containerID="0c0b95b9a274418bdf2e130672c84aaa5cd8171d0fefd670ec6a651638e9d158" Mar 20 07:45:28 crc kubenswrapper[4749]: E0320 07:45:28.178308 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:45:35 crc kubenswrapper[4749]: I0320 07:45:35.176975 4749 scope.go:117] "RemoveContainer" containerID="630f08807190c10ee7819d4aaeefeeb8553739d4a57777808e3b8461de8a1e8c" Mar 20 07:45:35 crc kubenswrapper[4749]: E0320 07:45:35.177948 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:45:42 crc kubenswrapper[4749]: I0320 07:45:42.178010 4749 scope.go:117] "RemoveContainer" containerID="0c0b95b9a274418bdf2e130672c84aaa5cd8171d0fefd670ec6a651638e9d158" Mar 20 07:45:42 crc kubenswrapper[4749]: E0320 07:45:42.178880 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:45:49 crc kubenswrapper[4749]: I0320 07:45:49.177169 4749 scope.go:117] "RemoveContainer" containerID="630f08807190c10ee7819d4aaeefeeb8553739d4a57777808e3b8461de8a1e8c" Mar 20 07:45:49 crc kubenswrapper[4749]: E0320 07:45:49.178516 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:45:54 crc kubenswrapper[4749]: I0320 07:45:54.183540 4749 scope.go:117] "RemoveContainer" containerID="0c0b95b9a274418bdf2e130672c84aaa5cd8171d0fefd670ec6a651638e9d158" Mar 20 07:45:54 crc kubenswrapper[4749]: E0320 07:45:54.185432 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:46:00 crc kubenswrapper[4749]: I0320 07:46:00.155604 4749 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29566546-wqdzs"] Mar 20 07:46:00 crc kubenswrapper[4749]: E0320 07:46:00.157325 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edd035e5-d0a2-41ee-a0ae-e0909392a9d4" containerName="collect-profiles" Mar 20 07:46:00 crc kubenswrapper[4749]: I0320 07:46:00.157349 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="edd035e5-d0a2-41ee-a0ae-e0909392a9d4" containerName="collect-profiles" Mar 20 07:46:00 crc kubenswrapper[4749]: I0320 07:46:00.157689 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="edd035e5-d0a2-41ee-a0ae-e0909392a9d4" containerName="collect-profiles" Mar 20 07:46:00 crc kubenswrapper[4749]: I0320 07:46:00.158717 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566546-wqdzs" Mar 20 07:46:00 crc kubenswrapper[4749]: I0320 07:46:00.161724 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 07:46:00 crc kubenswrapper[4749]: I0320 07:46:00.161771 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6rhdf" Mar 20 07:46:00 crc kubenswrapper[4749]: I0320 07:46:00.162366 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566546-wqdzs"] Mar 20 07:46:00 crc kubenswrapper[4749]: I0320 07:46:00.162913 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 07:46:00 crc kubenswrapper[4749]: I0320 07:46:00.355760 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd8vc\" (UniqueName: \"kubernetes.io/projected/2bef987c-8038-436d-a79e-c2346c61050b-kube-api-access-vd8vc\") pod \"auto-csr-approver-29566546-wqdzs\" (UID: \"2bef987c-8038-436d-a79e-c2346c61050b\") " pod="openshift-infra/auto-csr-approver-29566546-wqdzs" Mar 20 07:46:00 crc kubenswrapper[4749]: I0320 07:46:00.457749 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vd8vc\" (UniqueName: \"kubernetes.io/projected/2bef987c-8038-436d-a79e-c2346c61050b-kube-api-access-vd8vc\") pod \"auto-csr-approver-29566546-wqdzs\" (UID: \"2bef987c-8038-436d-a79e-c2346c61050b\") " pod="openshift-infra/auto-csr-approver-29566546-wqdzs" Mar 20 07:46:00 crc kubenswrapper[4749]: I0320 07:46:00.485498 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd8vc\" (UniqueName: \"kubernetes.io/projected/2bef987c-8038-436d-a79e-c2346c61050b-kube-api-access-vd8vc\") pod \"auto-csr-approver-29566546-wqdzs\" (UID: \"2bef987c-8038-436d-a79e-c2346c61050b\") " pod="openshift-infra/auto-csr-approver-29566546-wqdzs" Mar 20 07:46:00 crc kubenswrapper[4749]: I0320 07:46:00.491041 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566546-wqdzs" Mar 20 07:46:00 crc kubenswrapper[4749]: I0320 07:46:00.916575 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566546-wqdzs"] Mar 20 07:46:01 crc kubenswrapper[4749]: I0320 07:46:01.178033 4749 scope.go:117] "RemoveContainer" containerID="630f08807190c10ee7819d4aaeefeeb8553739d4a57777808e3b8461de8a1e8c" Mar 20 07:46:01 crc kubenswrapper[4749]: E0320 07:46:01.179746 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:46:01 crc kubenswrapper[4749]: I0320 07:46:01.543199 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566546-wqdzs" event={"ID":"2bef987c-8038-436d-a79e-c2346c61050b","Type":"ContainerStarted","Data":"df753685bbecbb69145d3f567ac7b2b9cb7892bf4c419a8395d0c19795b7027d"} Mar 20 07:46:02 crc kubenswrapper[4749]: I0320 07:46:02.560621 4749 generic.go:334] "Generic (PLEG): container finished" podID="2bef987c-8038-436d-a79e-c2346c61050b" containerID="eebcd41ddfad961cfc1f5283ed6419611c3a48008724ae585e93adf2005c59d6" exitCode=0 Mar 20 07:46:02 crc kubenswrapper[4749]: I0320 07:46:02.560674 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566546-wqdzs" event={"ID":"2bef987c-8038-436d-a79e-c2346c61050b","Type":"ContainerDied","Data":"eebcd41ddfad961cfc1f5283ed6419611c3a48008724ae585e93adf2005c59d6"} Mar 20 07:46:03 crc kubenswrapper[4749]: I0320 07:46:03.941862 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566546-wqdzs" Mar 20 07:46:04 crc kubenswrapper[4749]: I0320 07:46:04.016400 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vd8vc\" (UniqueName: \"kubernetes.io/projected/2bef987c-8038-436d-a79e-c2346c61050b-kube-api-access-vd8vc\") pod \"2bef987c-8038-436d-a79e-c2346c61050b\" (UID: \"2bef987c-8038-436d-a79e-c2346c61050b\") " Mar 20 07:46:04 crc kubenswrapper[4749]: I0320 07:46:04.022591 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bef987c-8038-436d-a79e-c2346c61050b-kube-api-access-vd8vc" (OuterVolumeSpecName: "kube-api-access-vd8vc") pod "2bef987c-8038-436d-a79e-c2346c61050b" (UID: "2bef987c-8038-436d-a79e-c2346c61050b"). InnerVolumeSpecName "kube-api-access-vd8vc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:46:04 crc kubenswrapper[4749]: I0320 07:46:04.118937 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vd8vc\" (UniqueName: \"kubernetes.io/projected/2bef987c-8038-436d-a79e-c2346c61050b-kube-api-access-vd8vc\") on node \"crc\" DevicePath \"\"" Mar 20 07:46:04 crc kubenswrapper[4749]: I0320 07:46:04.514959 4749 patch_prober.go:28] interesting pod/machine-config-daemon-fxqfd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 07:46:04 crc kubenswrapper[4749]: I0320 07:46:04.515410 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:46:04 crc kubenswrapper[4749]: I0320 07:46:04.578420 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566546-wqdzs" event={"ID":"2bef987c-8038-436d-a79e-c2346c61050b","Type":"ContainerDied","Data":"df753685bbecbb69145d3f567ac7b2b9cb7892bf4c419a8395d0c19795b7027d"} Mar 20 07:46:04 crc kubenswrapper[4749]: I0320 07:46:04.578463 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df753685bbecbb69145d3f567ac7b2b9cb7892bf4c419a8395d0c19795b7027d" Mar 20 07:46:04 crc kubenswrapper[4749]: I0320 07:46:04.578522 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566546-wqdzs" Mar 20 07:46:05 crc kubenswrapper[4749]: I0320 07:46:05.015959 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566540-nm2rc"] Mar 20 07:46:05 crc kubenswrapper[4749]: I0320 07:46:05.023278 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566540-nm2rc"] Mar 20 07:46:06 crc kubenswrapper[4749]: I0320 07:46:06.187843 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfac3eb1-0447-4727-bba6-62d133e9f4c1" path="/var/lib/kubelet/pods/dfac3eb1-0447-4727-bba6-62d133e9f4c1/volumes" Mar 20 07:46:07 crc kubenswrapper[4749]: I0320 07:46:07.178234 4749 scope.go:117] "RemoveContainer" containerID="0c0b95b9a274418bdf2e130672c84aaa5cd8171d0fefd670ec6a651638e9d158" Mar 20 07:46:07 crc kubenswrapper[4749]: E0320 07:46:07.178974 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:46:12 crc kubenswrapper[4749]: I0320 07:46:12.177848 4749 scope.go:117] "RemoveContainer" containerID="630f08807190c10ee7819d4aaeefeeb8553739d4a57777808e3b8461de8a1e8c" Mar 20 07:46:12 crc kubenswrapper[4749]: E0320 07:46:12.178852 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" 
podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:46:18 crc kubenswrapper[4749]: I0320 07:46:18.177891 4749 scope.go:117] "RemoveContainer" containerID="0c0b95b9a274418bdf2e130672c84aaa5cd8171d0fefd670ec6a651638e9d158" Mar 20 07:46:18 crc kubenswrapper[4749]: E0320 07:46:18.178706 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:46:24 crc kubenswrapper[4749]: I0320 07:46:24.184026 4749 scope.go:117] "RemoveContainer" containerID="630f08807190c10ee7819d4aaeefeeb8553739d4a57777808e3b8461de8a1e8c" Mar 20 07:46:24 crc kubenswrapper[4749]: E0320 07:46:24.184926 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:46:31 crc kubenswrapper[4749]: I0320 07:46:31.178520 4749 scope.go:117] "RemoveContainer" containerID="0c0b95b9a274418bdf2e130672c84aaa5cd8171d0fefd670ec6a651638e9d158" Mar 20 07:46:31 crc kubenswrapper[4749]: E0320 07:46:31.179568 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:46:34 crc kubenswrapper[4749]: I0320 07:46:34.515075 4749 patch_prober.go:28] interesting pod/machine-config-daemon-fxqfd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 07:46:34 crc kubenswrapper[4749]: I0320 07:46:34.515151 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:46:36 crc kubenswrapper[4749]: I0320 07:46:36.178211 4749 scope.go:117] "RemoveContainer" containerID="630f08807190c10ee7819d4aaeefeeb8553739d4a57777808e3b8461de8a1e8c" Mar 20 07:46:36 crc kubenswrapper[4749]: E0320 07:46:36.178975 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:46:45 crc kubenswrapper[4749]: I0320 07:46:45.178005 4749 scope.go:117] "RemoveContainer" containerID="0c0b95b9a274418bdf2e130672c84aaa5cd8171d0fefd670ec6a651638e9d158" Mar 20 07:46:45 crc kubenswrapper[4749]: E0320 07:46:45.179054 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq 
pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:46:50 crc kubenswrapper[4749]: I0320 07:46:50.178259 4749 scope.go:117] "RemoveContainer" containerID="630f08807190c10ee7819d4aaeefeeb8553739d4a57777808e3b8461de8a1e8c" Mar 20 07:46:50 crc kubenswrapper[4749]: E0320 07:46:50.179727 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:46:55 crc kubenswrapper[4749]: I0320 07:46:55.715234 4749 scope.go:117] "RemoveContainer" containerID="d49308ccaa0bd5a1ddfc4b3b993308b14c3ecbb9810f9a41522cd26ed17e80b9" Mar 20 07:46:58 crc kubenswrapper[4749]: I0320 07:46:58.177356 4749 scope.go:117] "RemoveContainer" containerID="0c0b95b9a274418bdf2e130672c84aaa5cd8171d0fefd670ec6a651638e9d158" Mar 20 07:46:58 crc kubenswrapper[4749]: E0320 07:46:58.177944 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:47:01 crc kubenswrapper[4749]: I0320 07:47:01.178533 4749 scope.go:117] "RemoveContainer" containerID="630f08807190c10ee7819d4aaeefeeb8553739d4a57777808e3b8461de8a1e8c" Mar 20 07:47:01 crc kubenswrapper[4749]: E0320 07:47:01.179427 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:47:04 crc kubenswrapper[4749]: I0320 07:47:04.514413 4749 patch_prober.go:28] interesting pod/machine-config-daemon-fxqfd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 07:47:04 crc kubenswrapper[4749]: I0320 07:47:04.514801 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:47:04 crc kubenswrapper[4749]: I0320 07:47:04.514873 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" Mar 20 07:47:04 crc kubenswrapper[4749]: I0320 07:47:04.515717 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f0b6b46505f9df084ed8de0c0f1cf3091e394d293032dd62e13935f99ca383ee"} pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 07:47:04 crc kubenswrapper[4749]: I0320 07:47:04.515851 4749 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerName="machine-config-daemon" containerID="cri-o://f0b6b46505f9df084ed8de0c0f1cf3091e394d293032dd62e13935f99ca383ee" gracePeriod=600 Mar 20 07:47:05 crc kubenswrapper[4749]: I0320 07:47:05.137726 4749 generic.go:334] "Generic (PLEG): container finished" podID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerID="f0b6b46505f9df084ed8de0c0f1cf3091e394d293032dd62e13935f99ca383ee" exitCode=0 Mar 20 07:47:05 crc kubenswrapper[4749]: I0320 07:47:05.137805 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" event={"ID":"12151228-1cb9-4086-9a62-f4a9583f5f69","Type":"ContainerDied","Data":"f0b6b46505f9df084ed8de0c0f1cf3091e394d293032dd62e13935f99ca383ee"} Mar 20 07:47:05 crc kubenswrapper[4749]: I0320 07:47:05.138330 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" event={"ID":"12151228-1cb9-4086-9a62-f4a9583f5f69","Type":"ContainerStarted","Data":"0ff9be3875797ada3dfffbc86acf0005e90268274e3d10bb1025a8c4c1ddfc14"} Mar 20 07:47:05 crc kubenswrapper[4749]: I0320 07:47:05.138351 4749 scope.go:117] "RemoveContainer" containerID="4d8e78efd9340edd6f9a69c0288b8e640be4f341bfa8796261268cc4055c4563" Mar 20 07:47:09 crc kubenswrapper[4749]: I0320 07:47:09.178169 4749 scope.go:117] "RemoveContainer" containerID="0c0b95b9a274418bdf2e130672c84aaa5cd8171d0fefd670ec6a651638e9d158" Mar 20 07:47:09 crc kubenswrapper[4749]: E0320 07:47:09.179603 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:47:12 crc kubenswrapper[4749]: I0320 07:47:12.177571 4749 scope.go:117] "RemoveContainer" containerID="630f08807190c10ee7819d4aaeefeeb8553739d4a57777808e3b8461de8a1e8c" Mar 20 07:47:12 crc kubenswrapper[4749]: E0320 07:47:12.178318 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:47:21 crc kubenswrapper[4749]: I0320 07:47:21.178537 4749 scope.go:117] "RemoveContainer" containerID="0c0b95b9a274418bdf2e130672c84aaa5cd8171d0fefd670ec6a651638e9d158" Mar 20 07:47:21 crc kubenswrapper[4749]: E0320 07:47:21.179773 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:47:23 crc kubenswrapper[4749]: I0320 07:47:23.178235 4749 scope.go:117] "RemoveContainer" containerID="630f08807190c10ee7819d4aaeefeeb8553739d4a57777808e3b8461de8a1e8c" Mar 20 07:47:23 crc kubenswrapper[4749]: E0320 07:47:23.179190 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:47:36 crc kubenswrapper[4749]: I0320 07:47:36.178007 4749 scope.go:117] "RemoveContainer" containerID="0c0b95b9a274418bdf2e130672c84aaa5cd8171d0fefd670ec6a651638e9d158" Mar 20 07:47:36 crc kubenswrapper[4749]: E0320 07:47:36.180957 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:47:38 crc kubenswrapper[4749]: I0320 07:47:38.177621 4749 scope.go:117] "RemoveContainer" containerID="630f08807190c10ee7819d4aaeefeeb8553739d4a57777808e3b8461de8a1e8c" Mar 20 07:47:38 crc kubenswrapper[4749]: E0320 07:47:38.177978 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:47:48 crc kubenswrapper[4749]: I0320 07:47:48.177794 4749 scope.go:117] "RemoveContainer" containerID="0c0b95b9a274418bdf2e130672c84aaa5cd8171d0fefd670ec6a651638e9d158" Mar 20 07:47:48 crc kubenswrapper[4749]: E0320 07:47:48.178958 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:47:50 crc kubenswrapper[4749]: I0320 07:47:50.177959 4749 scope.go:117] "RemoveContainer" containerID="630f08807190c10ee7819d4aaeefeeb8553739d4a57777808e3b8461de8a1e8c" Mar 20 07:47:50 crc kubenswrapper[4749]: E0320 07:47:50.179842 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:48:00 crc kubenswrapper[4749]: I0320 07:48:00.160681 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566548-6p4xc"] Mar 20 07:48:00 crc kubenswrapper[4749]: E0320 07:48:00.161503 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bef987c-8038-436d-a79e-c2346c61050b" containerName="oc" Mar 20 07:48:00 crc kubenswrapper[4749]: I0320 07:48:00.161516 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bef987c-8038-436d-a79e-c2346c61050b" containerName="oc" Mar 20 07:48:00 crc kubenswrapper[4749]: I0320 07:48:00.161696 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bef987c-8038-436d-a79e-c2346c61050b" containerName="oc" Mar 20 07:48:00 crc kubenswrapper[4749]: I0320 07:48:00.162184 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566548-6p4xc" Mar 20 07:48:00 crc kubenswrapper[4749]: I0320 07:48:00.163547 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6rhdf" Mar 20 07:48:00 crc kubenswrapper[4749]: I0320 07:48:00.165028 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 07:48:00 crc kubenswrapper[4749]: I0320 07:48:00.166608 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 07:48:00 crc kubenswrapper[4749]: I0320 07:48:00.174075 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566548-6p4xc"] Mar 20 07:48:00 crc kubenswrapper[4749]: I0320 07:48:00.177239 4749 scope.go:117] "RemoveContainer" containerID="0c0b95b9a274418bdf2e130672c84aaa5cd8171d0fefd670ec6a651638e9d158" Mar 20 07:48:00 crc kubenswrapper[4749]: E0320 07:48:00.177695 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:48:00 crc kubenswrapper[4749]: I0320 07:48:00.212028 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fht6\" (UniqueName: \"kubernetes.io/projected/b522c08e-c76b-4793-9947-5a5b53b5d5ba-kube-api-access-9fht6\") pod \"auto-csr-approver-29566548-6p4xc\" (UID: \"b522c08e-c76b-4793-9947-5a5b53b5d5ba\") " pod="openshift-infra/auto-csr-approver-29566548-6p4xc" Mar 20 07:48:00 crc kubenswrapper[4749]: I0320 07:48:00.313739 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fht6\" (UniqueName: \"kubernetes.io/projected/b522c08e-c76b-4793-9947-5a5b53b5d5ba-kube-api-access-9fht6\") pod \"auto-csr-approver-29566548-6p4xc\" (UID: \"b522c08e-c76b-4793-9947-5a5b53b5d5ba\") " pod="openshift-infra/auto-csr-approver-29566548-6p4xc" Mar 20 07:48:00 crc kubenswrapper[4749]: I0320 07:48:00.341987 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fht6\" (UniqueName: \"kubernetes.io/projected/b522c08e-c76b-4793-9947-5a5b53b5d5ba-kube-api-access-9fht6\") pod \"auto-csr-approver-29566548-6p4xc\" (UID: \"b522c08e-c76b-4793-9947-5a5b53b5d5ba\") " pod="openshift-infra/auto-csr-approver-29566548-6p4xc" Mar 20 07:48:00 crc kubenswrapper[4749]: I0320 07:48:00.490572 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566548-6p4xc" Mar 20 07:48:00 crc kubenswrapper[4749]: I0320 07:48:00.947621 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566548-6p4xc"] Mar 20 07:48:00 crc kubenswrapper[4749]: W0320 07:48:00.959155 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb522c08e_c76b_4793_9947_5a5b53b5d5ba.slice/crio-779edfac91b2251c335eaeb54493b814498cebff078d006391b6cf8ed6b0ddcc WatchSource:0}: Error finding container 779edfac91b2251c335eaeb54493b814498cebff078d006391b6cf8ed6b0ddcc: Status 404 returned error can't find the container with id 779edfac91b2251c335eaeb54493b814498cebff078d006391b6cf8ed6b0ddcc Mar 20 07:48:01 crc kubenswrapper[4749]: I0320 07:48:01.177305 4749 scope.go:117] "RemoveContainer" containerID="630f08807190c10ee7819d4aaeefeeb8553739d4a57777808e3b8461de8a1e8c" Mar 20 07:48:01 crc kubenswrapper[4749]: E0320 07:48:01.179061 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:48:01 crc kubenswrapper[4749]: I0320 07:48:01.651570 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566548-6p4xc" event={"ID":"b522c08e-c76b-4793-9947-5a5b53b5d5ba","Type":"ContainerStarted","Data":"779edfac91b2251c335eaeb54493b814498cebff078d006391b6cf8ed6b0ddcc"} Mar 20 07:48:02 crc kubenswrapper[4749]: I0320 07:48:02.147412 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-x6flg"] Mar 20 07:48:02 crc kubenswrapper[4749]: I0320 07:48:02.150334 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-x6flg" Mar 20 07:48:02 crc kubenswrapper[4749]: I0320 07:48:02.162445 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x6flg"] Mar 20 07:48:02 crc kubenswrapper[4749]: I0320 07:48:02.247199 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4eb44b3f-341a-4a35-b8bb-0976b2abaa38-catalog-content\") pod \"redhat-operators-x6flg\" (UID: \"4eb44b3f-341a-4a35-b8bb-0976b2abaa38\") " pod="openshift-marketplace/redhat-operators-x6flg" Mar 20 07:48:02 crc kubenswrapper[4749]: I0320 07:48:02.247305 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4eb44b3f-341a-4a35-b8bb-0976b2abaa38-utilities\") pod \"redhat-operators-x6flg\" (UID: \"4eb44b3f-341a-4a35-b8bb-0976b2abaa38\") " pod="openshift-marketplace/redhat-operators-x6flg" Mar 20 07:48:02 crc kubenswrapper[4749]: I0320 07:48:02.247396 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bns2m\" (UniqueName: \"kubernetes.io/projected/4eb44b3f-341a-4a35-b8bb-0976b2abaa38-kube-api-access-bns2m\") pod \"redhat-operators-x6flg\" (UID: \"4eb44b3f-341a-4a35-b8bb-0976b2abaa38\") " pod="openshift-marketplace/redhat-operators-x6flg" Mar 20 07:48:02 crc kubenswrapper[4749]: I0320 07:48:02.348936 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bns2m\" (UniqueName: \"kubernetes.io/projected/4eb44b3f-341a-4a35-b8bb-0976b2abaa38-kube-api-access-bns2m\") pod \"redhat-operators-x6flg\" (UID: \"4eb44b3f-341a-4a35-b8bb-0976b2abaa38\") " pod="openshift-marketplace/redhat-operators-x6flg" Mar 20 07:48:02 crc kubenswrapper[4749]: I0320 07:48:02.349361 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4eb44b3f-341a-4a35-b8bb-0976b2abaa38-catalog-content\") pod \"redhat-operators-x6flg\" (UID: \"4eb44b3f-341a-4a35-b8bb-0976b2abaa38\") " pod="openshift-marketplace/redhat-operators-x6flg" Mar 20 07:48:02 crc kubenswrapper[4749]: I0320 07:48:02.349419 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4eb44b3f-341a-4a35-b8bb-0976b2abaa38-utilities\") pod \"redhat-operators-x6flg\" (UID: \"4eb44b3f-341a-4a35-b8bb-0976b2abaa38\") " pod="openshift-marketplace/redhat-operators-x6flg" Mar 20 07:48:02 crc kubenswrapper[4749]: I0320 07:48:02.349813 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4eb44b3f-341a-4a35-b8bb-0976b2abaa38-catalog-content\") pod \"redhat-operators-x6flg\" (UID: \"4eb44b3f-341a-4a35-b8bb-0976b2abaa38\") " pod="openshift-marketplace/redhat-operators-x6flg" Mar 20 07:48:02 crc kubenswrapper[4749]: I0320 07:48:02.349888 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4eb44b3f-341a-4a35-b8bb-0976b2abaa38-utilities\") pod \"redhat-operators-x6flg\" (UID: \"4eb44b3f-341a-4a35-b8bb-0976b2abaa38\") " pod="openshift-marketplace/redhat-operators-x6flg" Mar 20 07:48:02 crc kubenswrapper[4749]: I0320 07:48:02.369157 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-bns2m\" (UniqueName: \"kubernetes.io/projected/4eb44b3f-341a-4a35-b8bb-0976b2abaa38-kube-api-access-bns2m\") pod \"redhat-operators-x6flg\" (UID: \"4eb44b3f-341a-4a35-b8bb-0976b2abaa38\") " pod="openshift-marketplace/redhat-operators-x6flg" Mar 20 07:48:02 crc kubenswrapper[4749]: I0320 07:48:02.506505 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x6flg" Mar 20 07:48:02 crc kubenswrapper[4749]: I0320 07:48:02.660106 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566548-6p4xc" event={"ID":"b522c08e-c76b-4793-9947-5a5b53b5d5ba","Type":"ContainerStarted","Data":"d74c60bc449ab7677872ff0176882e4d21d6538d1fc6893167e08391845666ab"} Mar 20 07:48:02 crc kubenswrapper[4749]: I0320 07:48:02.676192 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566548-6p4xc" podStartSLOduration=1.588366919 podStartE2EDuration="2.676170362s" podCreationTimestamp="2026-03-20 07:48:00 +0000 UTC" firstStartedPulling="2026-03-20 07:48:00.962798234 +0000 UTC m=+2117.512455921" lastFinishedPulling="2026-03-20 07:48:02.050601677 +0000 UTC m=+2118.600259364" observedRunningTime="2026-03-20 07:48:02.673164459 +0000 UTC m=+2119.222822106" watchObservedRunningTime="2026-03-20 07:48:02.676170362 +0000 UTC m=+2119.225828009" Mar 20 07:48:02 crc kubenswrapper[4749]: W0320 07:48:02.983659 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4eb44b3f_341a_4a35_b8bb_0976b2abaa38.slice/crio-1f161befe6efc852844e7471ef484e2ea3ccc24f27e9f32858a3f3ea9152f965 WatchSource:0}: Error finding container 1f161befe6efc852844e7471ef484e2ea3ccc24f27e9f32858a3f3ea9152f965: Status 404 returned error can't find the container with id 1f161befe6efc852844e7471ef484e2ea3ccc24f27e9f32858a3f3ea9152f965 Mar 20 07:48:02 crc kubenswrapper[4749]: I0320 07:48:02.988213 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x6flg"] Mar 20 07:48:03 crc kubenswrapper[4749]: I0320 07:48:03.668684 4749 generic.go:334] "Generic (PLEG): container finished" podID="b522c08e-c76b-4793-9947-5a5b53b5d5ba" containerID="d74c60bc449ab7677872ff0176882e4d21d6538d1fc6893167e08391845666ab" exitCode=0 Mar 20 07:48:03 crc kubenswrapper[4749]: I0320 07:48:03.668770 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566548-6p4xc" event={"ID":"b522c08e-c76b-4793-9947-5a5b53b5d5ba","Type":"ContainerDied","Data":"d74c60bc449ab7677872ff0176882e4d21d6538d1fc6893167e08391845666ab"} Mar 20 07:48:03 crc kubenswrapper[4749]: I0320 07:48:03.671518 4749 generic.go:334] "Generic (PLEG): container finished" podID="4eb44b3f-341a-4a35-b8bb-0976b2abaa38" containerID="0839fcb778925243bb637f93e2113f1c086a3060fbff542cdc62d15d8ed87e95" exitCode=0 Mar 20 07:48:03 crc kubenswrapper[4749]: I0320 07:48:03.671593 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x6flg" event={"ID":"4eb44b3f-341a-4a35-b8bb-0976b2abaa38","Type":"ContainerDied","Data":"0839fcb778925243bb637f93e2113f1c086a3060fbff542cdc62d15d8ed87e95"} Mar 20 07:48:03 crc kubenswrapper[4749]: I0320 07:48:03.671628 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x6flg" 
event={"ID":"4eb44b3f-341a-4a35-b8bb-0976b2abaa38","Type":"ContainerStarted","Data":"1f161befe6efc852844e7471ef484e2ea3ccc24f27e9f32858a3f3ea9152f965"} Mar 20 07:48:05 crc kubenswrapper[4749]: I0320 07:48:05.029952 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566548-6p4xc" Mar 20 07:48:05 crc kubenswrapper[4749]: I0320 07:48:05.092190 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fht6\" (UniqueName: \"kubernetes.io/projected/b522c08e-c76b-4793-9947-5a5b53b5d5ba-kube-api-access-9fht6\") pod \"b522c08e-c76b-4793-9947-5a5b53b5d5ba\" (UID: \"b522c08e-c76b-4793-9947-5a5b53b5d5ba\") " Mar 20 07:48:05 crc kubenswrapper[4749]: I0320 07:48:05.097833 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b522c08e-c76b-4793-9947-5a5b53b5d5ba-kube-api-access-9fht6" (OuterVolumeSpecName: "kube-api-access-9fht6") pod "b522c08e-c76b-4793-9947-5a5b53b5d5ba" (UID: "b522c08e-c76b-4793-9947-5a5b53b5d5ba"). InnerVolumeSpecName "kube-api-access-9fht6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:48:05 crc kubenswrapper[4749]: I0320 07:48:05.194012 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fht6\" (UniqueName: \"kubernetes.io/projected/b522c08e-c76b-4793-9947-5a5b53b5d5ba-kube-api-access-9fht6\") on node \"crc\" DevicePath \"\"" Mar 20 07:48:05 crc kubenswrapper[4749]: I0320 07:48:05.690238 4749 generic.go:334] "Generic (PLEG): container finished" podID="4eb44b3f-341a-4a35-b8bb-0976b2abaa38" containerID="0a22b1ec149e6a885d5a7129dc089071a62bcf9ae6302ed6c0fed5e94e5452e1" exitCode=0 Mar 20 07:48:05 crc kubenswrapper[4749]: I0320 07:48:05.690326 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x6flg" event={"ID":"4eb44b3f-341a-4a35-b8bb-0976b2abaa38","Type":"ContainerDied","Data":"0a22b1ec149e6a885d5a7129dc089071a62bcf9ae6302ed6c0fed5e94e5452e1"} Mar 20 07:48:05 crc kubenswrapper[4749]: I0320 07:48:05.693421 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566548-6p4xc" event={"ID":"b522c08e-c76b-4793-9947-5a5b53b5d5ba","Type":"ContainerDied","Data":"779edfac91b2251c335eaeb54493b814498cebff078d006391b6cf8ed6b0ddcc"} Mar 20 07:48:05 crc kubenswrapper[4749]: I0320 07:48:05.693460 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="779edfac91b2251c335eaeb54493b814498cebff078d006391b6cf8ed6b0ddcc" Mar 20 07:48:05 crc kubenswrapper[4749]: I0320 07:48:05.693530 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566548-6p4xc" Mar 20 07:48:05 crc kubenswrapper[4749]: I0320 07:48:05.756046 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566542-lps6m"] Mar 20 07:48:05 crc kubenswrapper[4749]: I0320 07:48:05.765778 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566542-lps6m"] Mar 20 07:48:06 crc kubenswrapper[4749]: I0320 07:48:06.192926 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09126e47-94db-4c40-bc68-23606616605d" path="/var/lib/kubelet/pods/09126e47-94db-4c40-bc68-23606616605d/volumes" Mar 20 07:48:06 crc kubenswrapper[4749]: I0320 07:48:06.701807 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x6flg" event={"ID":"4eb44b3f-341a-4a35-b8bb-0976b2abaa38","Type":"ContainerStarted","Data":"b391a0a73a9f17e0fa163fe4fbc3fe656be0b393a0c6045037fd59785aaa614d"} Mar 20 07:48:06 crc kubenswrapper[4749]: I0320 07:48:06.727246 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-x6flg" podStartSLOduration=2.275228643 podStartE2EDuration="4.727225623s" podCreationTimestamp="2026-03-20 07:48:02 +0000 UTC" firstStartedPulling="2026-03-20 07:48:03.672916853 +0000 UTC m=+2120.222574520" lastFinishedPulling="2026-03-20 07:48:06.124913853 +0000 UTC m=+2122.674571500" observedRunningTime="2026-03-20 07:48:06.720303495 +0000 UTC m=+2123.269961132" watchObservedRunningTime="2026-03-20 07:48:06.727225623 +0000 UTC m=+2123.276883270" Mar 20 07:48:12 crc kubenswrapper[4749]: I0320 07:48:12.178180 4749 scope.go:117] "RemoveContainer" containerID="0c0b95b9a274418bdf2e130672c84aaa5cd8171d0fefd670ec6a651638e9d158" Mar 20 07:48:12 crc kubenswrapper[4749]: E0320 07:48:12.179107 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:48:12 crc kubenswrapper[4749]: I0320 07:48:12.507329 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-x6flg" Mar 20 07:48:12 crc kubenswrapper[4749]: I0320 07:48:12.507387 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-x6flg" Mar 20 07:48:13 crc kubenswrapper[4749]: I0320 07:48:13.549718 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-x6flg" podUID="4eb44b3f-341a-4a35-b8bb-0976b2abaa38" containerName="registry-server" probeResult="failure" output=< Mar 20 07:48:13 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Mar 20 07:48:13 crc kubenswrapper[4749]: > Mar 20 07:48:16 crc kubenswrapper[4749]: I0320 07:48:16.177984 4749 scope.go:117] "RemoveContainer" containerID="630f08807190c10ee7819d4aaeefeeb8553739d4a57777808e3b8461de8a1e8c" Mar 20 07:48:16 crc kubenswrapper[4749]: E0320 07:48:16.178797 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" 
podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:48:22 crc kubenswrapper[4749]: I0320 07:48:22.551325 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-x6flg" Mar 20 07:48:22 crc kubenswrapper[4749]: I0320 07:48:22.604962 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-x6flg" Mar 20 07:48:22 crc kubenswrapper[4749]: I0320 07:48:22.787646 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x6flg"] Mar 20 07:48:23 crc kubenswrapper[4749]: I0320 07:48:23.177749 4749 scope.go:117] "RemoveContainer" containerID="0c0b95b9a274418bdf2e130672c84aaa5cd8171d0fefd670ec6a651638e9d158" Mar 20 07:48:23 crc kubenswrapper[4749]: E0320 07:48:23.178764 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:48:23 crc kubenswrapper[4749]: I0320 07:48:23.858248 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-x6flg" podUID="4eb44b3f-341a-4a35-b8bb-0976b2abaa38" containerName="registry-server" containerID="cri-o://b391a0a73a9f17e0fa163fe4fbc3fe656be0b393a0c6045037fd59785aaa614d" gracePeriod=2 Mar 20 07:48:24 crc kubenswrapper[4749]: I0320 07:48:24.314992 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x6flg" Mar 20 07:48:24 crc kubenswrapper[4749]: I0320 07:48:24.386172 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bns2m\" (UniqueName: \"kubernetes.io/projected/4eb44b3f-341a-4a35-b8bb-0976b2abaa38-kube-api-access-bns2m\") pod \"4eb44b3f-341a-4a35-b8bb-0976b2abaa38\" (UID: \"4eb44b3f-341a-4a35-b8bb-0976b2abaa38\") " Mar 20 07:48:24 crc kubenswrapper[4749]: I0320 07:48:24.386299 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4eb44b3f-341a-4a35-b8bb-0976b2abaa38-catalog-content\") pod \"4eb44b3f-341a-4a35-b8bb-0976b2abaa38\" (UID: \"4eb44b3f-341a-4a35-b8bb-0976b2abaa38\") " Mar 20 07:48:24 crc kubenswrapper[4749]: I0320 07:48:24.386484 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4eb44b3f-341a-4a35-b8bb-0976b2abaa38-utilities\") pod \"4eb44b3f-341a-4a35-b8bb-0976b2abaa38\" (UID: \"4eb44b3f-341a-4a35-b8bb-0976b2abaa38\") " Mar 20 07:48:24 crc kubenswrapper[4749]: I0320 07:48:24.387875 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4eb44b3f-341a-4a35-b8bb-0976b2abaa38-utilities" (OuterVolumeSpecName: "utilities") pod "4eb44b3f-341a-4a35-b8bb-0976b2abaa38" (UID: "4eb44b3f-341a-4a35-b8bb-0976b2abaa38"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:48:24 crc kubenswrapper[4749]: I0320 07:48:24.393730 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4eb44b3f-341a-4a35-b8bb-0976b2abaa38-kube-api-access-bns2m" (OuterVolumeSpecName: "kube-api-access-bns2m") pod "4eb44b3f-341a-4a35-b8bb-0976b2abaa38" (UID: "4eb44b3f-341a-4a35-b8bb-0976b2abaa38"). InnerVolumeSpecName "kube-api-access-bns2m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:48:24 crc kubenswrapper[4749]: I0320 07:48:24.489260 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4eb44b3f-341a-4a35-b8bb-0976b2abaa38-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 07:48:24 crc kubenswrapper[4749]: I0320 07:48:24.489335 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bns2m\" (UniqueName: \"kubernetes.io/projected/4eb44b3f-341a-4a35-b8bb-0976b2abaa38-kube-api-access-bns2m\") on node \"crc\" DevicePath \"\"" Mar 20 07:48:24 crc kubenswrapper[4749]: I0320 07:48:24.548176 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4eb44b3f-341a-4a35-b8bb-0976b2abaa38-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4eb44b3f-341a-4a35-b8bb-0976b2abaa38" (UID: "4eb44b3f-341a-4a35-b8bb-0976b2abaa38"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:48:24 crc kubenswrapper[4749]: I0320 07:48:24.593408 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4eb44b3f-341a-4a35-b8bb-0976b2abaa38-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 07:48:24 crc kubenswrapper[4749]: I0320 07:48:24.869673 4749 generic.go:334] "Generic (PLEG): container finished" podID="4eb44b3f-341a-4a35-b8bb-0976b2abaa38" containerID="b391a0a73a9f17e0fa163fe4fbc3fe656be0b393a0c6045037fd59785aaa614d" exitCode=0 Mar 20 07:48:24 crc kubenswrapper[4749]: I0320 07:48:24.869758 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x6flg" event={"ID":"4eb44b3f-341a-4a35-b8bb-0976b2abaa38","Type":"ContainerDied","Data":"b391a0a73a9f17e0fa163fe4fbc3fe656be0b393a0c6045037fd59785aaa614d"} Mar 20 07:48:24 crc kubenswrapper[4749]: I0320 07:48:24.869811 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x6flg" event={"ID":"4eb44b3f-341a-4a35-b8bb-0976b2abaa38","Type":"ContainerDied","Data":"1f161befe6efc852844e7471ef484e2ea3ccc24f27e9f32858a3f3ea9152f965"} Mar 20 07:48:24 crc kubenswrapper[4749]: I0320 07:48:24.869831 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-x6flg" Mar 20 07:48:24 crc kubenswrapper[4749]: I0320 07:48:24.869850 4749 scope.go:117] "RemoveContainer" containerID="b391a0a73a9f17e0fa163fe4fbc3fe656be0b393a0c6045037fd59785aaa614d" Mar 20 07:48:24 crc kubenswrapper[4749]: I0320 07:48:24.908378 4749 scope.go:117] "RemoveContainer" containerID="0a22b1ec149e6a885d5a7129dc089071a62bcf9ae6302ed6c0fed5e94e5452e1" Mar 20 07:48:24 crc kubenswrapper[4749]: I0320 07:48:24.930072 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x6flg"] Mar 20 07:48:24 crc kubenswrapper[4749]: I0320 07:48:24.939258 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-x6flg"] Mar 20 07:48:24 crc kubenswrapper[4749]: I0320 07:48:24.955982 4749 scope.go:117] "RemoveContainer" containerID="0839fcb778925243bb637f93e2113f1c086a3060fbff542cdc62d15d8ed87e95" Mar 20 07:48:24 crc kubenswrapper[4749]: I0320 07:48:24.988431 4749 scope.go:117] "RemoveContainer" containerID="b391a0a73a9f17e0fa163fe4fbc3fe656be0b393a0c6045037fd59785aaa614d" Mar 20 07:48:24 crc kubenswrapper[4749]: E0320 07:48:24.991166 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b391a0a73a9f17e0fa163fe4fbc3fe656be0b393a0c6045037fd59785aaa614d\": container with ID starting with b391a0a73a9f17e0fa163fe4fbc3fe656be0b393a0c6045037fd59785aaa614d not found: ID does not exist" containerID="b391a0a73a9f17e0fa163fe4fbc3fe656be0b393a0c6045037fd59785aaa614d" Mar 20 07:48:24 crc kubenswrapper[4749]: I0320 07:48:24.991198 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b391a0a73a9f17e0fa163fe4fbc3fe656be0b393a0c6045037fd59785aaa614d"} err="failed to get container status \"b391a0a73a9f17e0fa163fe4fbc3fe656be0b393a0c6045037fd59785aaa614d\": rpc error: code = NotFound desc = could not find container \"b391a0a73a9f17e0fa163fe4fbc3fe656be0b393a0c6045037fd59785aaa614d\": container with ID starting with b391a0a73a9f17e0fa163fe4fbc3fe656be0b393a0c6045037fd59785aaa614d not found: ID does not exist" Mar 20 07:48:24 crc kubenswrapper[4749]: I0320 07:48:24.991219 4749 scope.go:117] "RemoveContainer" containerID="0a22b1ec149e6a885d5a7129dc089071a62bcf9ae6302ed6c0fed5e94e5452e1" Mar 20 07:48:24 crc kubenswrapper[4749]: E0320 07:48:24.991844 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a22b1ec149e6a885d5a7129dc089071a62bcf9ae6302ed6c0fed5e94e5452e1\": container with ID starting with 0a22b1ec149e6a885d5a7129dc089071a62bcf9ae6302ed6c0fed5e94e5452e1 not found: ID does not exist" containerID="0a22b1ec149e6a885d5a7129dc089071a62bcf9ae6302ed6c0fed5e94e5452e1" Mar 20 07:48:24 crc kubenswrapper[4749]: I0320 07:48:24.991890 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a22b1ec149e6a885d5a7129dc089071a62bcf9ae6302ed6c0fed5e94e5452e1"} err="failed to get container status \"0a22b1ec149e6a885d5a7129dc089071a62bcf9ae6302ed6c0fed5e94e5452e1\": rpc error: code = NotFound desc = could not find container \"0a22b1ec149e6a885d5a7129dc089071a62bcf9ae6302ed6c0fed5e94e5452e1\": container with ID starting with 0a22b1ec149e6a885d5a7129dc089071a62bcf9ae6302ed6c0fed5e94e5452e1 not found: ID does not exist" Mar 20 07:48:24 crc kubenswrapper[4749]: I0320 07:48:24.991918 4749 scope.go:117] "RemoveContainer" 
containerID="0839fcb778925243bb637f93e2113f1c086a3060fbff542cdc62d15d8ed87e95" Mar 20 07:48:24 crc kubenswrapper[4749]: E0320 07:48:24.992191 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0839fcb778925243bb637f93e2113f1c086a3060fbff542cdc62d15d8ed87e95\": container with ID starting with 0839fcb778925243bb637f93e2113f1c086a3060fbff542cdc62d15d8ed87e95 not found: ID does not exist" containerID="0839fcb778925243bb637f93e2113f1c086a3060fbff542cdc62d15d8ed87e95" Mar 20 07:48:24 crc kubenswrapper[4749]: I0320 07:48:24.992218 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0839fcb778925243bb637f93e2113f1c086a3060fbff542cdc62d15d8ed87e95"} err="failed to get container status \"0839fcb778925243bb637f93e2113f1c086a3060fbff542cdc62d15d8ed87e95\": rpc error: code = NotFound desc = could not find container \"0839fcb778925243bb637f93e2113f1c086a3060fbff542cdc62d15d8ed87e95\": container with ID starting with 0839fcb778925243bb637f93e2113f1c086a3060fbff542cdc62d15d8ed87e95 not found: ID does not exist" Mar 20 07:48:26 crc kubenswrapper[4749]: I0320 07:48:26.194687 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4eb44b3f-341a-4a35-b8bb-0976b2abaa38" path="/var/lib/kubelet/pods/4eb44b3f-341a-4a35-b8bb-0976b2abaa38/volumes" Mar 20 07:48:30 crc kubenswrapper[4749]: I0320 07:48:30.178673 4749 scope.go:117] "RemoveContainer" containerID="630f08807190c10ee7819d4aaeefeeb8553739d4a57777808e3b8461de8a1e8c" Mar 20 07:48:30 crc kubenswrapper[4749]: E0320 07:48:30.179564 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:48:37 crc kubenswrapper[4749]: I0320 07:48:37.178428 4749 scope.go:117] "RemoveContainer" containerID="0c0b95b9a274418bdf2e130672c84aaa5cd8171d0fefd670ec6a651638e9d158" Mar 20 07:48:38 crc kubenswrapper[4749]: I0320 07:48:38.005021 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8b9b402f-2d95-48f5-98d8-497d90956ba2","Type":"ContainerStarted","Data":"6623fc7f986ec52db4e939e8a32c1ab92b205fb441e21157a62a09d0ff2ab3ec"} Mar 20 07:48:38 crc kubenswrapper[4749]: I0320 07:48:38.005598 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:48:41 crc kubenswrapper[4749]: I0320 07:48:41.177379 4749 scope.go:117] "RemoveContainer" containerID="630f08807190c10ee7819d4aaeefeeb8553739d4a57777808e3b8461de8a1e8c" Mar 20 07:48:42 crc kubenswrapper[4749]: I0320 07:48:42.049717 4749 generic.go:334] "Generic (PLEG): container finished" podID="8b9b402f-2d95-48f5-98d8-497d90956ba2" containerID="6623fc7f986ec52db4e939e8a32c1ab92b205fb441e21157a62a09d0ff2ab3ec" exitCode=0 Mar 20 07:48:42 crc kubenswrapper[4749]: I0320 07:48:42.049796 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8b9b402f-2d95-48f5-98d8-497d90956ba2","Type":"ContainerDied","Data":"6623fc7f986ec52db4e939e8a32c1ab92b205fb441e21157a62a09d0ff2ab3ec"} Mar 20 07:48:42 crc kubenswrapper[4749]: I0320 07:48:42.049995 4749 scope.go:117] "RemoveContainer" containerID="0c0b95b9a274418bdf2e130672c84aaa5cd8171d0fefd670ec6a651638e9d158" Mar 20 
07:48:42 crc kubenswrapper[4749]: I0320 07:48:42.051187 4749 scope.go:117] "RemoveContainer" containerID="6623fc7f986ec52db4e939e8a32c1ab92b205fb441e21157a62a09d0ff2ab3ec" Mar 20 07:48:42 crc kubenswrapper[4749]: E0320 07:48:42.051817 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:48:42 crc kubenswrapper[4749]: I0320 07:48:42.054194 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8db06e36-0b00-4157-9345-69449da3e85f","Type":"ContainerStarted","Data":"364d6fbf9694bce4c59b5da272a600ca47f38998291aad7a3137e901c1d7e851"} Mar 20 07:48:42 crc kubenswrapper[4749]: I0320 07:48:42.055201 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 20 07:48:42 crc kubenswrapper[4749]: I0320 07:48:42.765057 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-p9d7m"] Mar 20 07:48:42 crc kubenswrapper[4749]: E0320 07:48:42.765615 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b522c08e-c76b-4793-9947-5a5b53b5d5ba" containerName="oc" Mar 20 07:48:42 crc kubenswrapper[4749]: I0320 07:48:42.765638 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b522c08e-c76b-4793-9947-5a5b53b5d5ba" containerName="oc" Mar 20 07:48:42 crc kubenswrapper[4749]: E0320 07:48:42.765690 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eb44b3f-341a-4a35-b8bb-0976b2abaa38" containerName="extract-utilities" Mar 20 07:48:42 crc kubenswrapper[4749]: I0320 07:48:42.765703 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eb44b3f-341a-4a35-b8bb-0976b2abaa38" containerName="extract-utilities" Mar 20 07:48:42 crc kubenswrapper[4749]: E0320 07:48:42.765726 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eb44b3f-341a-4a35-b8bb-0976b2abaa38" containerName="registry-server" Mar 20 07:48:42 crc kubenswrapper[4749]: I0320 07:48:42.765741 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eb44b3f-341a-4a35-b8bb-0976b2abaa38" containerName="registry-server" Mar 20 07:48:42 crc kubenswrapper[4749]: E0320 07:48:42.765775 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eb44b3f-341a-4a35-b8bb-0976b2abaa38" containerName="extract-content" Mar 20 07:48:42 crc kubenswrapper[4749]: I0320 07:48:42.765789 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eb44b3f-341a-4a35-b8bb-0976b2abaa38" containerName="extract-content" Mar 20 07:48:42 crc kubenswrapper[4749]: I0320 07:48:42.766065 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b522c08e-c76b-4793-9947-5a5b53b5d5ba" containerName="oc" Mar 20 07:48:42 crc kubenswrapper[4749]: I0320 07:48:42.766095 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="4eb44b3f-341a-4a35-b8bb-0976b2abaa38" containerName="registry-server" Mar 20 07:48:42 crc kubenswrapper[4749]: I0320 07:48:42.768443 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p9d7m" Mar 20 07:48:42 crc kubenswrapper[4749]: I0320 07:48:42.785362 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p9d7m"] Mar 20 07:48:42 crc kubenswrapper[4749]: I0320 07:48:42.923092 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69af4d7d-d164-4541-b2cf-edc3ce20af02-utilities\") pod \"community-operators-p9d7m\" (UID: \"69af4d7d-d164-4541-b2cf-edc3ce20af02\") " pod="openshift-marketplace/community-operators-p9d7m" Mar 20 07:48:42 crc kubenswrapper[4749]: I0320 07:48:42.923501 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69af4d7d-d164-4541-b2cf-edc3ce20af02-catalog-content\") pod \"community-operators-p9d7m\" (UID: \"69af4d7d-d164-4541-b2cf-edc3ce20af02\") " pod="openshift-marketplace/community-operators-p9d7m" Mar 20 07:48:42 crc kubenswrapper[4749]: I0320 07:48:42.923662 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c27gq\" (UniqueName: \"kubernetes.io/projected/69af4d7d-d164-4541-b2cf-edc3ce20af02-kube-api-access-c27gq\") pod \"community-operators-p9d7m\" (UID: \"69af4d7d-d164-4541-b2cf-edc3ce20af02\") " pod="openshift-marketplace/community-operators-p9d7m" Mar 20 07:48:43 crc kubenswrapper[4749]: I0320 07:48:43.024771 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69af4d7d-d164-4541-b2cf-edc3ce20af02-utilities\") pod \"community-operators-p9d7m\" (UID: \"69af4d7d-d164-4541-b2cf-edc3ce20af02\") " pod="openshift-marketplace/community-operators-p9d7m" Mar 20 07:48:43 crc kubenswrapper[4749]: I0320 07:48:43.024818 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69af4d7d-d164-4541-b2cf-edc3ce20af02-catalog-content\") pod \"community-operators-p9d7m\" (UID: \"69af4d7d-d164-4541-b2cf-edc3ce20af02\") " pod="openshift-marketplace/community-operators-p9d7m" Mar 20 07:48:43 crc kubenswrapper[4749]: I0320 07:48:43.024898 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c27gq\" (UniqueName: \"kubernetes.io/projected/69af4d7d-d164-4541-b2cf-edc3ce20af02-kube-api-access-c27gq\") pod \"community-operators-p9d7m\" (UID: \"69af4d7d-d164-4541-b2cf-edc3ce20af02\") " pod="openshift-marketplace/community-operators-p9d7m" Mar 20 07:48:43 crc kubenswrapper[4749]: I0320 07:48:43.025777 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69af4d7d-d164-4541-b2cf-edc3ce20af02-utilities\") pod \"community-operators-p9d7m\" (UID: \"69af4d7d-d164-4541-b2cf-edc3ce20af02\") " pod="openshift-marketplace/community-operators-p9d7m" Mar 20 07:48:43 crc kubenswrapper[4749]: I0320 07:48:43.025844 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69af4d7d-d164-4541-b2cf-edc3ce20af02-catalog-content\") pod \"community-operators-p9d7m\" (UID: \"69af4d7d-d164-4541-b2cf-edc3ce20af02\") " pod="openshift-marketplace/community-operators-p9d7m" Mar 20 07:48:43 crc kubenswrapper[4749]: I0320 07:48:43.048729 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-c27gq\" (UniqueName: \"kubernetes.io/projected/69af4d7d-d164-4541-b2cf-edc3ce20af02-kube-api-access-c27gq\") pod \"community-operators-p9d7m\" (UID: \"69af4d7d-d164-4541-b2cf-edc3ce20af02\") " pod="openshift-marketplace/community-operators-p9d7m" Mar 20 07:48:43 crc kubenswrapper[4749]: I0320 07:48:43.119884 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p9d7m" Mar 20 07:48:43 crc kubenswrapper[4749]: W0320 07:48:43.622011 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69af4d7d_d164_4541_b2cf_edc3ce20af02.slice/crio-4bae01a9af4417761f1d6625863b3ee7c049cc442eafc7c7c27223c0662eed11 WatchSource:0}: Error finding container 4bae01a9af4417761f1d6625863b3ee7c049cc442eafc7c7c27223c0662eed11: Status 404 returned error can't find the container with id 4bae01a9af4417761f1d6625863b3ee7c049cc442eafc7c7c27223c0662eed11 Mar 20 07:48:43 crc kubenswrapper[4749]: I0320 07:48:43.622355 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p9d7m"] Mar 20 07:48:44 crc kubenswrapper[4749]: I0320 07:48:44.075102 4749 generic.go:334] "Generic (PLEG): container finished" podID="69af4d7d-d164-4541-b2cf-edc3ce20af02" containerID="f641422b22099ca23e8ff29556474bf791860887f5660732e93b48b31b7f6a37" exitCode=0 Mar 20 07:48:44 crc kubenswrapper[4749]: I0320 07:48:44.075159 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p9d7m" event={"ID":"69af4d7d-d164-4541-b2cf-edc3ce20af02","Type":"ContainerDied","Data":"f641422b22099ca23e8ff29556474bf791860887f5660732e93b48b31b7f6a37"} Mar 20 07:48:44 crc kubenswrapper[4749]: I0320 07:48:44.075214 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p9d7m" event={"ID":"69af4d7d-d164-4541-b2cf-edc3ce20af02","Type":"ContainerStarted","Data":"4bae01a9af4417761f1d6625863b3ee7c049cc442eafc7c7c27223c0662eed11"} Mar 20 07:48:46 crc kubenswrapper[4749]: I0320 07:48:46.090217 4749 generic.go:334] "Generic (PLEG): container finished" podID="8db06e36-0b00-4157-9345-69449da3e85f" containerID="364d6fbf9694bce4c59b5da272a600ca47f38998291aad7a3137e901c1d7e851" exitCode=0 Mar 20 07:48:46 crc kubenswrapper[4749]: I0320 07:48:46.090320 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8db06e36-0b00-4157-9345-69449da3e85f","Type":"ContainerDied","Data":"364d6fbf9694bce4c59b5da272a600ca47f38998291aad7a3137e901c1d7e851"} Mar 20 07:48:46 crc kubenswrapper[4749]: I0320 07:48:46.090495 4749 scope.go:117] "RemoveContainer" containerID="630f08807190c10ee7819d4aaeefeeb8553739d4a57777808e3b8461de8a1e8c" Mar 20 07:48:46 crc kubenswrapper[4749]: I0320 07:48:46.091685 4749 scope.go:117] "RemoveContainer" containerID="364d6fbf9694bce4c59b5da272a600ca47f38998291aad7a3137e901c1d7e851" Mar 20 07:48:46 crc kubenswrapper[4749]: E0320 07:48:46.093672 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:48:48 crc kubenswrapper[4749]: I0320 07:48:48.114878 4749 generic.go:334] "Generic (PLEG): container finished" 
podID="69af4d7d-d164-4541-b2cf-edc3ce20af02" containerID="7e03f25911547efdd254f0e16cacbcd69a277238da1931289621fb7ee891aacb" exitCode=0 Mar 20 07:48:48 crc kubenswrapper[4749]: I0320 07:48:48.115260 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p9d7m" event={"ID":"69af4d7d-d164-4541-b2cf-edc3ce20af02","Type":"ContainerDied","Data":"7e03f25911547efdd254f0e16cacbcd69a277238da1931289621fb7ee891aacb"} Mar 20 07:48:49 crc kubenswrapper[4749]: I0320 07:48:49.128338 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p9d7m" event={"ID":"69af4d7d-d164-4541-b2cf-edc3ce20af02","Type":"ContainerStarted","Data":"4f53ee8c83d83fa60813c86d1c12d578cc51cc79992a86b12269ec0bfc37b1a4"} Mar 20 07:48:49 crc kubenswrapper[4749]: I0320 07:48:49.161888 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-p9d7m" podStartSLOduration=2.6940246610000003 podStartE2EDuration="7.161858326s" podCreationTimestamp="2026-03-20 07:48:42 +0000 UTC" firstStartedPulling="2026-03-20 07:48:44.077799103 +0000 UTC m=+2160.627456750" lastFinishedPulling="2026-03-20 07:48:48.545632768 +0000 UTC m=+2165.095290415" observedRunningTime="2026-03-20 07:48:49.155518462 +0000 UTC m=+2165.705176189" watchObservedRunningTime="2026-03-20 07:48:49.161858326 +0000 UTC m=+2165.711516003" Mar 20 07:48:53 crc kubenswrapper[4749]: I0320 07:48:53.120585 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-p9d7m" Mar 20 07:48:53 crc kubenswrapper[4749]: I0320 07:48:53.121203 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-p9d7m" Mar 20 07:48:53 crc kubenswrapper[4749]: I0320 07:48:53.177697 4749 scope.go:117] "RemoveContainer" containerID="6623fc7f986ec52db4e939e8a32c1ab92b205fb441e21157a62a09d0ff2ab3ec" Mar 20 07:48:53 crc kubenswrapper[4749]: E0320 07:48:53.178313 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:48:53 crc kubenswrapper[4749]: I0320 07:48:53.183580 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-p9d7m" Mar 20 07:48:53 crc kubenswrapper[4749]: I0320 07:48:53.247274 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-p9d7m" Mar 20 07:48:53 crc kubenswrapper[4749]: I0320 07:48:53.332565 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p9d7m"] Mar 20 07:48:53 crc kubenswrapper[4749]: I0320 07:48:53.441975 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-stfh9"] Mar 20 07:48:53 crc kubenswrapper[4749]: I0320 07:48:53.442237 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-stfh9" podUID="6a3930fe-a227-4dbd-82ec-d9e95f06a317" containerName="registry-server" containerID="cri-o://f6b2fac2e461e50f05cef1faab4c29fdb7e31db074c484c4e0da070c7486bfb3" gracePeriod=2 Mar 20 07:48:53 crc kubenswrapper[4749]: I0320 07:48:53.877729 
4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-stfh9" Mar 20 07:48:53 crc kubenswrapper[4749]: I0320 07:48:53.913687 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a3930fe-a227-4dbd-82ec-d9e95f06a317-utilities\") pod \"6a3930fe-a227-4dbd-82ec-d9e95f06a317\" (UID: \"6a3930fe-a227-4dbd-82ec-d9e95f06a317\") " Mar 20 07:48:53 crc kubenswrapper[4749]: I0320 07:48:53.913782 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2pvp\" (UniqueName: \"kubernetes.io/projected/6a3930fe-a227-4dbd-82ec-d9e95f06a317-kube-api-access-b2pvp\") pod \"6a3930fe-a227-4dbd-82ec-d9e95f06a317\" (UID: \"6a3930fe-a227-4dbd-82ec-d9e95f06a317\") " Mar 20 07:48:53 crc kubenswrapper[4749]: I0320 07:48:53.913843 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a3930fe-a227-4dbd-82ec-d9e95f06a317-catalog-content\") pod \"6a3930fe-a227-4dbd-82ec-d9e95f06a317\" (UID: \"6a3930fe-a227-4dbd-82ec-d9e95f06a317\") " Mar 20 07:48:53 crc kubenswrapper[4749]: I0320 07:48:53.916891 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a3930fe-a227-4dbd-82ec-d9e95f06a317-utilities" (OuterVolumeSpecName: "utilities") pod "6a3930fe-a227-4dbd-82ec-d9e95f06a317" (UID: "6a3930fe-a227-4dbd-82ec-d9e95f06a317"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:48:53 crc kubenswrapper[4749]: I0320 07:48:53.936034 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a3930fe-a227-4dbd-82ec-d9e95f06a317-kube-api-access-b2pvp" (OuterVolumeSpecName: "kube-api-access-b2pvp") pod "6a3930fe-a227-4dbd-82ec-d9e95f06a317" (UID: "6a3930fe-a227-4dbd-82ec-d9e95f06a317"). InnerVolumeSpecName "kube-api-access-b2pvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:48:53 crc kubenswrapper[4749]: I0320 07:48:53.963773 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a3930fe-a227-4dbd-82ec-d9e95f06a317-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6a3930fe-a227-4dbd-82ec-d9e95f06a317" (UID: "6a3930fe-a227-4dbd-82ec-d9e95f06a317"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:48:54 crc kubenswrapper[4749]: I0320 07:48:54.016227 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a3930fe-a227-4dbd-82ec-d9e95f06a317-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 07:48:54 crc kubenswrapper[4749]: I0320 07:48:54.016275 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2pvp\" (UniqueName: \"kubernetes.io/projected/6a3930fe-a227-4dbd-82ec-d9e95f06a317-kube-api-access-b2pvp\") on node \"crc\" DevicePath \"\"" Mar 20 07:48:54 crc kubenswrapper[4749]: I0320 07:48:54.016306 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a3930fe-a227-4dbd-82ec-d9e95f06a317-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 07:48:54 crc kubenswrapper[4749]: I0320 07:48:54.174404 4749 generic.go:334] "Generic (PLEG): container finished" podID="6a3930fe-a227-4dbd-82ec-d9e95f06a317" containerID="f6b2fac2e461e50f05cef1faab4c29fdb7e31db074c484c4e0da070c7486bfb3" exitCode=0 Mar 20 07:48:54 crc kubenswrapper[4749]: I0320 07:48:54.174445 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-stfh9" event={"ID":"6a3930fe-a227-4dbd-82ec-d9e95f06a317","Type":"ContainerDied","Data":"f6b2fac2e461e50f05cef1faab4c29fdb7e31db074c484c4e0da070c7486bfb3"} Mar 20 07:48:54 crc kubenswrapper[4749]: I0320 07:48:54.174470 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-stfh9" Mar 20 07:48:54 crc kubenswrapper[4749]: I0320 07:48:54.174750 4749 scope.go:117] "RemoveContainer" containerID="f6b2fac2e461e50f05cef1faab4c29fdb7e31db074c484c4e0da070c7486bfb3" Mar 20 07:48:54 crc kubenswrapper[4749]: I0320 07:48:54.174737 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-stfh9" event={"ID":"6a3930fe-a227-4dbd-82ec-d9e95f06a317","Type":"ContainerDied","Data":"3eb2f72792ab96572facf7e27d140b48614adf25ba8d46a827a9656553982724"} Mar 20 07:48:54 crc kubenswrapper[4749]: I0320 07:48:54.196961 4749 scope.go:117] "RemoveContainer" containerID="3359aae1a50aa53f1c3e07fb925baadb6e1528b9841ba2771de24abd137f0b7b" Mar 20 07:48:54 crc kubenswrapper[4749]: I0320 07:48:54.204023 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-stfh9"] Mar 20 07:48:54 crc kubenswrapper[4749]: I0320 07:48:54.216257 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-stfh9"] Mar 20 07:48:54 crc kubenswrapper[4749]: I0320 07:48:54.233600 4749 scope.go:117] "RemoveContainer" containerID="bcf72696bf3ecaf814e13551a98eae22a7b100f381e250fe1783c5c42afc09b4" Mar 20 07:48:54 crc kubenswrapper[4749]: I0320 07:48:54.250760 4749 scope.go:117] "RemoveContainer" containerID="f6b2fac2e461e50f05cef1faab4c29fdb7e31db074c484c4e0da070c7486bfb3" Mar 20 07:48:54 crc kubenswrapper[4749]: E0320 07:48:54.251183 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6b2fac2e461e50f05cef1faab4c29fdb7e31db074c484c4e0da070c7486bfb3\": container with ID starting with f6b2fac2e461e50f05cef1faab4c29fdb7e31db074c484c4e0da070c7486bfb3 not found: ID does not exist" containerID="f6b2fac2e461e50f05cef1faab4c29fdb7e31db074c484c4e0da070c7486bfb3" Mar 20 07:48:54 crc kubenswrapper[4749]: I0320 07:48:54.251236 
4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6b2fac2e461e50f05cef1faab4c29fdb7e31db074c484c4e0da070c7486bfb3"} err="failed to get container status \"f6b2fac2e461e50f05cef1faab4c29fdb7e31db074c484c4e0da070c7486bfb3\": rpc error: code = NotFound desc = could not find container \"f6b2fac2e461e50f05cef1faab4c29fdb7e31db074c484c4e0da070c7486bfb3\": container with ID starting with f6b2fac2e461e50f05cef1faab4c29fdb7e31db074c484c4e0da070c7486bfb3 not found: ID does not exist" Mar 20 07:48:54 crc kubenswrapper[4749]: I0320 07:48:54.251262 4749 scope.go:117] "RemoveContainer" containerID="3359aae1a50aa53f1c3e07fb925baadb6e1528b9841ba2771de24abd137f0b7b" Mar 20 07:48:54 crc kubenswrapper[4749]: E0320 07:48:54.251659 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3359aae1a50aa53f1c3e07fb925baadb6e1528b9841ba2771de24abd137f0b7b\": container with ID starting with 3359aae1a50aa53f1c3e07fb925baadb6e1528b9841ba2771de24abd137f0b7b not found: ID does not exist" containerID="3359aae1a50aa53f1c3e07fb925baadb6e1528b9841ba2771de24abd137f0b7b" Mar 20 07:48:54 crc kubenswrapper[4749]: I0320 07:48:54.251707 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3359aae1a50aa53f1c3e07fb925baadb6e1528b9841ba2771de24abd137f0b7b"} err="failed to get container status \"3359aae1a50aa53f1c3e07fb925baadb6e1528b9841ba2771de24abd137f0b7b\": rpc error: code = NotFound desc = could not find container \"3359aae1a50aa53f1c3e07fb925baadb6e1528b9841ba2771de24abd137f0b7b\": container with ID starting with 3359aae1a50aa53f1c3e07fb925baadb6e1528b9841ba2771de24abd137f0b7b not found: ID does not exist" Mar 20 07:48:54 crc kubenswrapper[4749]: I0320 07:48:54.251740 4749 scope.go:117] "RemoveContainer" containerID="bcf72696bf3ecaf814e13551a98eae22a7b100f381e250fe1783c5c42afc09b4" Mar 20 07:48:54 crc kubenswrapper[4749]: E0320 07:48:54.252107 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcf72696bf3ecaf814e13551a98eae22a7b100f381e250fe1783c5c42afc09b4\": container with ID starting with bcf72696bf3ecaf814e13551a98eae22a7b100f381e250fe1783c5c42afc09b4 not found: ID does not exist" containerID="bcf72696bf3ecaf814e13551a98eae22a7b100f381e250fe1783c5c42afc09b4" Mar 20 07:48:54 crc kubenswrapper[4749]: I0320 07:48:54.252156 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcf72696bf3ecaf814e13551a98eae22a7b100f381e250fe1783c5c42afc09b4"} err="failed to get container status \"bcf72696bf3ecaf814e13551a98eae22a7b100f381e250fe1783c5c42afc09b4\": rpc error: code = NotFound desc = could not find container \"bcf72696bf3ecaf814e13551a98eae22a7b100f381e250fe1783c5c42afc09b4\": container with ID starting with bcf72696bf3ecaf814e13551a98eae22a7b100f381e250fe1783c5c42afc09b4 not found: ID does not exist" Mar 20 07:48:55 crc kubenswrapper[4749]: I0320 07:48:55.816612 4749 scope.go:117] "RemoveContainer" containerID="533bc999252f7f2716fcb93fcc15d78088e935ee8e2798a10a088a312763c003" Mar 20 07:48:56 crc kubenswrapper[4749]: I0320 07:48:56.177223 4749 scope.go:117] "RemoveContainer" containerID="364d6fbf9694bce4c59b5da272a600ca47f38998291aad7a3137e901c1d7e851" Mar 20 07:48:56 crc kubenswrapper[4749]: E0320 07:48:56.177712 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:48:56 crc kubenswrapper[4749]: I0320 07:48:56.195555 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a3930fe-a227-4dbd-82ec-d9e95f06a317" path="/var/lib/kubelet/pods/6a3930fe-a227-4dbd-82ec-d9e95f06a317/volumes" Mar 20 07:49:04 crc kubenswrapper[4749]: I0320 07:49:04.187979 4749 scope.go:117] "RemoveContainer" containerID="6623fc7f986ec52db4e939e8a32c1ab92b205fb441e21157a62a09d0ff2ab3ec" Mar 20 07:49:04 crc kubenswrapper[4749]: E0320 07:49:04.189072 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:49:04 crc kubenswrapper[4749]: I0320 07:49:04.514547 4749 patch_prober.go:28] interesting pod/machine-config-daemon-fxqfd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 07:49:04 crc kubenswrapper[4749]: I0320 07:49:04.514639 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:49:07 crc kubenswrapper[4749]: I0320 07:49:07.177655 4749 scope.go:117] "RemoveContainer" containerID="364d6fbf9694bce4c59b5da272a600ca47f38998291aad7a3137e901c1d7e851" Mar 20 07:49:07 crc kubenswrapper[4749]: E0320 07:49:07.178224 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:49:18 crc kubenswrapper[4749]: I0320 07:49:18.178499 4749 scope.go:117] "RemoveContainer" containerID="6623fc7f986ec52db4e939e8a32c1ab92b205fb441e21157a62a09d0ff2ab3ec" Mar 20 07:49:18 crc kubenswrapper[4749]: E0320 07:49:18.179623 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:49:19 crc kubenswrapper[4749]: I0320 07:49:19.178090 4749 scope.go:117] "RemoveContainer" containerID="364d6fbf9694bce4c59b5da272a600ca47f38998291aad7a3137e901c1d7e851" Mar 20 07:49:19 crc kubenswrapper[4749]: E0320 07:49:19.178802 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:49:32 
crc kubenswrapper[4749]: I0320 07:49:32.178584 4749 scope.go:117] "RemoveContainer" containerID="6623fc7f986ec52db4e939e8a32c1ab92b205fb441e21157a62a09d0ff2ab3ec" Mar 20 07:49:32 crc kubenswrapper[4749]: E0320 07:49:32.179842 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:49:33 crc kubenswrapper[4749]: I0320 07:49:33.177750 4749 scope.go:117] "RemoveContainer" containerID="364d6fbf9694bce4c59b5da272a600ca47f38998291aad7a3137e901c1d7e851" Mar 20 07:49:33 crc kubenswrapper[4749]: E0320 07:49:33.177993 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:49:34 crc kubenswrapper[4749]: I0320 07:49:34.515046 4749 patch_prober.go:28] interesting pod/machine-config-daemon-fxqfd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 07:49:34 crc kubenswrapper[4749]: I0320 07:49:34.515501 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:49:44 crc kubenswrapper[4749]: I0320 07:49:44.187790 4749 scope.go:117] "RemoveContainer" containerID="364d6fbf9694bce4c59b5da272a600ca47f38998291aad7a3137e901c1d7e851" Mar 20 07:49:44 crc kubenswrapper[4749]: E0320 07:49:44.188778 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:49:45 crc kubenswrapper[4749]: I0320 07:49:45.177442 4749 scope.go:117] "RemoveContainer" containerID="6623fc7f986ec52db4e939e8a32c1ab92b205fb441e21157a62a09d0ff2ab3ec" Mar 20 07:49:45 crc kubenswrapper[4749]: E0320 07:49:45.177791 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:49:57 crc kubenswrapper[4749]: I0320 07:49:57.178612 4749 scope.go:117] "RemoveContainer" containerID="364d6fbf9694bce4c59b5da272a600ca47f38998291aad7a3137e901c1d7e851" Mar 20 07:49:57 crc kubenswrapper[4749]: E0320 07:49:57.179522 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq 
pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:49:58 crc kubenswrapper[4749]: I0320 07:49:58.177796 4749 scope.go:117] "RemoveContainer" containerID="6623fc7f986ec52db4e939e8a32c1ab92b205fb441e21157a62a09d0ff2ab3ec" Mar 20 07:49:58 crc kubenswrapper[4749]: E0320 07:49:58.178165 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:50:00 crc kubenswrapper[4749]: I0320 07:50:00.165133 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566550-54kkc"] Mar 20 07:50:00 crc kubenswrapper[4749]: E0320 07:50:00.166068 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a3930fe-a227-4dbd-82ec-d9e95f06a317" containerName="registry-server" Mar 20 07:50:00 crc kubenswrapper[4749]: I0320 07:50:00.166089 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a3930fe-a227-4dbd-82ec-d9e95f06a317" containerName="registry-server" Mar 20 07:50:00 crc kubenswrapper[4749]: E0320 07:50:00.166119 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a3930fe-a227-4dbd-82ec-d9e95f06a317" containerName="extract-content" Mar 20 07:50:00 crc kubenswrapper[4749]: I0320 07:50:00.166131 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a3930fe-a227-4dbd-82ec-d9e95f06a317" containerName="extract-content" Mar 20 07:50:00 crc kubenswrapper[4749]: E0320 07:50:00.166165 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a3930fe-a227-4dbd-82ec-d9e95f06a317" containerName="extract-utilities" Mar 20 07:50:00 crc kubenswrapper[4749]: I0320 07:50:00.166178 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a3930fe-a227-4dbd-82ec-d9e95f06a317" containerName="extract-utilities" Mar 20 07:50:00 crc kubenswrapper[4749]: I0320 07:50:00.166497 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a3930fe-a227-4dbd-82ec-d9e95f06a317" containerName="registry-server" Mar 20 07:50:00 crc kubenswrapper[4749]: I0320 07:50:00.167316 4749 util.go:30] "No sandbox for pod can be found. 
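
The rabbitmq-server-0 and rabbitmq-cell1-server-0 entries above repeat at each pod sync with the same "back-off 5m0s" message: both pods have restarted often enough that the kubelet's crash-loop backoff has saturated at its cap, so every RemoveContainer attempt is immediately refused until the full five minutes elapse. A minimal sketch of that doubling schedule, assuming the commonly documented kubelet defaults of a 10s initial delay and a 5m ceiling; the helper below is illustrative, not kubelet source:

// backoff.go: sketch of kubelet-style crash-loop backoff.
// Assumes the widely documented defaults: 10s initial delay,
// doubling per restart, capped at 5m ("back-off 5m0s" above).
package main

import (
	"fmt"
	"time"
)

func crashLoopDelay(restarts int, initial, max time.Duration) time.Duration {
	d := initial
	for i := 0; i < restarts; i++ {
		d *= 2
		if d >= max {
			return max
		}
	}
	return d
}

func main() {
	for r := 0; r <= 6; r++ {
		fmt.Printf("restart %d -> wait %s\n", r, crashLoopDelay(r, 10*time.Second, 5*time.Minute))
	}
	// restart 0 -> 10s, 1 -> 20s, ... 5 -> 5m0s: after five restarts
	// the delay saturates, which is why every retry above logs 5m0s.
}
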
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566550-54kkc" Mar 20 07:50:00 crc kubenswrapper[4749]: I0320 07:50:00.171690 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 07:50:00 crc kubenswrapper[4749]: I0320 07:50:00.171984 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 07:50:00 crc kubenswrapper[4749]: I0320 07:50:00.172133 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6rhdf" Mar 20 07:50:00 crc kubenswrapper[4749]: I0320 07:50:00.240622 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566550-54kkc"] Mar 20 07:50:00 crc kubenswrapper[4749]: I0320 07:50:00.260924 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsstt\" (UniqueName: \"kubernetes.io/projected/b925be05-c5d6-40bf-974b-2964fdc28a95-kube-api-access-jsstt\") pod \"auto-csr-approver-29566550-54kkc\" (UID: \"b925be05-c5d6-40bf-974b-2964fdc28a95\") " pod="openshift-infra/auto-csr-approver-29566550-54kkc" Mar 20 07:50:00 crc kubenswrapper[4749]: I0320 07:50:00.362705 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsstt\" (UniqueName: \"kubernetes.io/projected/b925be05-c5d6-40bf-974b-2964fdc28a95-kube-api-access-jsstt\") pod \"auto-csr-approver-29566550-54kkc\" (UID: \"b925be05-c5d6-40bf-974b-2964fdc28a95\") " pod="openshift-infra/auto-csr-approver-29566550-54kkc" Mar 20 07:50:00 crc kubenswrapper[4749]: I0320 07:50:00.386421 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsstt\" (UniqueName: \"kubernetes.io/projected/b925be05-c5d6-40bf-974b-2964fdc28a95-kube-api-access-jsstt\") pod \"auto-csr-approver-29566550-54kkc\" (UID: \"b925be05-c5d6-40bf-974b-2964fdc28a95\") " pod="openshift-infra/auto-csr-approver-29566550-54kkc" Mar 20 07:50:00 crc kubenswrapper[4749]: I0320 07:50:00.541325 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566550-54kkc" Mar 20 07:50:00 crc kubenswrapper[4749]: I0320 07:50:00.997012 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566550-54kkc"] Mar 20 07:50:01 crc kubenswrapper[4749]: I0320 07:50:01.008768 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 07:50:01 crc kubenswrapper[4749]: I0320 07:50:01.157497 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566550-54kkc" event={"ID":"b925be05-c5d6-40bf-974b-2964fdc28a95","Type":"ContainerStarted","Data":"fa179d3fd98abdcfbb11f14363eb7b286568f7a3a501791cf7d4d753be80835f"} Mar 20 07:50:03 crc kubenswrapper[4749]: I0320 07:50:03.178593 4749 generic.go:334] "Generic (PLEG): container finished" podID="b925be05-c5d6-40bf-974b-2964fdc28a95" containerID="8ef8ce5580f70e3e0d1ed8b2cf72dc05232022a8f3e65ea8398f09e903439fb9" exitCode=0 Mar 20 07:50:03 crc kubenswrapper[4749]: I0320 07:50:03.178945 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566550-54kkc" event={"ID":"b925be05-c5d6-40bf-974b-2964fdc28a95","Type":"ContainerDied","Data":"8ef8ce5580f70e3e0d1ed8b2cf72dc05232022a8f3e65ea8398f09e903439fb9"} Mar 20 07:50:04 crc kubenswrapper[4749]: I0320 07:50:04.514225 4749 patch_prober.go:28] interesting pod/machine-config-daemon-fxqfd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 07:50:04 crc kubenswrapper[4749]: I0320 07:50:04.514878 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:50:04 crc kubenswrapper[4749]: I0320 07:50:04.514923 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" Mar 20 07:50:04 crc kubenswrapper[4749]: I0320 07:50:04.515586 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0ff9be3875797ada3dfffbc86acf0005e90268274e3d10bb1025a8c4c1ddfc14"} pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 07:50:04 crc kubenswrapper[4749]: I0320 07:50:04.515644 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerName="machine-config-daemon" containerID="cri-o://0ff9be3875797ada3dfffbc86acf0005e90268274e3d10bb1025a8c4c1ddfc14" gracePeriod=600 Mar 20 07:50:04 crc kubenswrapper[4749]: I0320 07:50:04.570309 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566550-54kkc" Mar 20 07:50:04 crc kubenswrapper[4749]: E0320 07:50:04.644247 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 07:50:04 crc kubenswrapper[4749]: I0320 07:50:04.738306 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsstt\" (UniqueName: \"kubernetes.io/projected/b925be05-c5d6-40bf-974b-2964fdc28a95-kube-api-access-jsstt\") pod \"b925be05-c5d6-40bf-974b-2964fdc28a95\" (UID: \"b925be05-c5d6-40bf-974b-2964fdc28a95\") " Mar 20 07:50:04 crc kubenswrapper[4749]: I0320 07:50:04.745463 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b925be05-c5d6-40bf-974b-2964fdc28a95-kube-api-access-jsstt" (OuterVolumeSpecName: "kube-api-access-jsstt") pod "b925be05-c5d6-40bf-974b-2964fdc28a95" (UID: "b925be05-c5d6-40bf-974b-2964fdc28a95"). InnerVolumeSpecName "kube-api-access-jsstt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:50:04 crc kubenswrapper[4749]: I0320 07:50:04.839808 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsstt\" (UniqueName: \"kubernetes.io/projected/b925be05-c5d6-40bf-974b-2964fdc28a95-kube-api-access-jsstt\") on node \"crc\" DevicePath \"\"" Mar 20 07:50:05 crc kubenswrapper[4749]: I0320 07:50:05.197012 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566550-54kkc" event={"ID":"b925be05-c5d6-40bf-974b-2964fdc28a95","Type":"ContainerDied","Data":"fa179d3fd98abdcfbb11f14363eb7b286568f7a3a501791cf7d4d753be80835f"} Mar 20 07:50:05 crc kubenswrapper[4749]: I0320 07:50:05.197048 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566550-54kkc" Mar 20 07:50:05 crc kubenswrapper[4749]: I0320 07:50:05.197055 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa179d3fd98abdcfbb11f14363eb7b286568f7a3a501791cf7d4d753be80835f" Mar 20 07:50:05 crc kubenswrapper[4749]: I0320 07:50:05.201786 4749 generic.go:334] "Generic (PLEG): container finished" podID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerID="0ff9be3875797ada3dfffbc86acf0005e90268274e3d10bb1025a8c4c1ddfc14" exitCode=0 Mar 20 07:50:05 crc kubenswrapper[4749]: I0320 07:50:05.201822 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" event={"ID":"12151228-1cb9-4086-9a62-f4a9583f5f69","Type":"ContainerDied","Data":"0ff9be3875797ada3dfffbc86acf0005e90268274e3d10bb1025a8c4c1ddfc14"} Mar 20 07:50:05 crc kubenswrapper[4749]: I0320 07:50:05.201855 4749 scope.go:117] "RemoveContainer" containerID="f0b6b46505f9df084ed8de0c0f1cf3091e394d293032dd62e13935f99ca383ee" Mar 20 07:50:05 crc kubenswrapper[4749]: I0320 07:50:05.202607 4749 scope.go:117] "RemoveContainer" containerID="0ff9be3875797ada3dfffbc86acf0005e90268274e3d10bb1025a8c4c1ddfc14" Mar 20 07:50:05 crc kubenswrapper[4749]: E0320 07:50:05.203068 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 07:50:05 crc kubenswrapper[4749]: I0320 07:50:05.641013 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566544-jkm2b"] Mar 20 07:50:05 crc kubenswrapper[4749]: I0320 07:50:05.646532 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566544-jkm2b"] Mar 20 07:50:06 crc kubenswrapper[4749]: I0320 07:50:06.189755 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a388828-add8-4a36-a802-7a12bc486545" path="/var/lib/kubelet/pods/9a388828-add8-4a36-a802-7a12bc486545/volumes" Mar 20 07:50:10 crc kubenswrapper[4749]: I0320 07:50:10.178021 4749 scope.go:117] "RemoveContainer" containerID="6623fc7f986ec52db4e939e8a32c1ab92b205fb441e21157a62a09d0ff2ab3ec" Mar 20 07:50:10 crc kubenswrapper[4749]: E0320 07:50:10.178903 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:50:12 crc kubenswrapper[4749]: I0320 07:50:12.177501 4749 scope.go:117] "RemoveContainer" containerID="364d6fbf9694bce4c59b5da272a600ca47f38998291aad7a3137e901c1d7e851" Mar 20 07:50:12 crc kubenswrapper[4749]: E0320 07:50:12.178164 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:50:17 crc kubenswrapper[4749]: I0320 
07:50:17.176995 4749 scope.go:117] "RemoveContainer" containerID="0ff9be3875797ada3dfffbc86acf0005e90268274e3d10bb1025a8c4c1ddfc14" Mar 20 07:50:17 crc kubenswrapper[4749]: E0320 07:50:17.177885 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 07:50:24 crc kubenswrapper[4749]: I0320 07:50:24.183596 4749 scope.go:117] "RemoveContainer" containerID="6623fc7f986ec52db4e939e8a32c1ab92b205fb441e21157a62a09d0ff2ab3ec" Mar 20 07:50:24 crc kubenswrapper[4749]: E0320 07:50:24.184455 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:50:24 crc kubenswrapper[4749]: I0320 07:50:24.616531 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2fkn4"] Mar 20 07:50:24 crc kubenswrapper[4749]: E0320 07:50:24.616980 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b925be05-c5d6-40bf-974b-2964fdc28a95" containerName="oc" Mar 20 07:50:24 crc kubenswrapper[4749]: I0320 07:50:24.617007 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b925be05-c5d6-40bf-974b-2964fdc28a95" containerName="oc" Mar 20 07:50:24 crc kubenswrapper[4749]: I0320 07:50:24.617220 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b925be05-c5d6-40bf-974b-2964fdc28a95" containerName="oc" Mar 20 07:50:24 crc kubenswrapper[4749]: I0320 07:50:24.618649 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2fkn4" Mar 20 07:50:24 crc kubenswrapper[4749]: I0320 07:50:24.637771 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2fkn4"] Mar 20 07:50:24 crc kubenswrapper[4749]: I0320 07:50:24.715709 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2krsx\" (UniqueName: \"kubernetes.io/projected/c46b8569-66f8-4b4e-a646-53a71f3e4ef5-kube-api-access-2krsx\") pod \"certified-operators-2fkn4\" (UID: \"c46b8569-66f8-4b4e-a646-53a71f3e4ef5\") " pod="openshift-marketplace/certified-operators-2fkn4" Mar 20 07:50:24 crc kubenswrapper[4749]: I0320 07:50:24.716022 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c46b8569-66f8-4b4e-a646-53a71f3e4ef5-utilities\") pod \"certified-operators-2fkn4\" (UID: \"c46b8569-66f8-4b4e-a646-53a71f3e4ef5\") " pod="openshift-marketplace/certified-operators-2fkn4" Mar 20 07:50:24 crc kubenswrapper[4749]: I0320 07:50:24.716097 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c46b8569-66f8-4b4e-a646-53a71f3e4ef5-catalog-content\") pod \"certified-operators-2fkn4\" (UID: \"c46b8569-66f8-4b4e-a646-53a71f3e4ef5\") " pod="openshift-marketplace/certified-operators-2fkn4" Mar 20 07:50:24 crc kubenswrapper[4749]: I0320 07:50:24.817437 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c46b8569-66f8-4b4e-a646-53a71f3e4ef5-catalog-content\") pod \"certified-operators-2fkn4\" (UID: \"c46b8569-66f8-4b4e-a646-53a71f3e4ef5\") " pod="openshift-marketplace/certified-operators-2fkn4" Mar 20 07:50:24 crc kubenswrapper[4749]: I0320 07:50:24.817797 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2krsx\" (UniqueName: \"kubernetes.io/projected/c46b8569-66f8-4b4e-a646-53a71f3e4ef5-kube-api-access-2krsx\") pod \"certified-operators-2fkn4\" (UID: \"c46b8569-66f8-4b4e-a646-53a71f3e4ef5\") " pod="openshift-marketplace/certified-operators-2fkn4" Mar 20 07:50:24 crc kubenswrapper[4749]: I0320 07:50:24.817869 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c46b8569-66f8-4b4e-a646-53a71f3e4ef5-utilities\") pod \"certified-operators-2fkn4\" (UID: \"c46b8569-66f8-4b4e-a646-53a71f3e4ef5\") " pod="openshift-marketplace/certified-operators-2fkn4" Mar 20 07:50:24 crc kubenswrapper[4749]: I0320 07:50:24.818326 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c46b8569-66f8-4b4e-a646-53a71f3e4ef5-catalog-content\") pod \"certified-operators-2fkn4\" (UID: \"c46b8569-66f8-4b4e-a646-53a71f3e4ef5\") " pod="openshift-marketplace/certified-operators-2fkn4" Mar 20 07:50:24 crc kubenswrapper[4749]: I0320 07:50:24.818347 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c46b8569-66f8-4b4e-a646-53a71f3e4ef5-utilities\") pod \"certified-operators-2fkn4\" (UID: \"c46b8569-66f8-4b4e-a646-53a71f3e4ef5\") " pod="openshift-marketplace/certified-operators-2fkn4" Mar 20 07:50:24 crc kubenswrapper[4749]: I0320 07:50:24.840562 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-2krsx\" (UniqueName: \"kubernetes.io/projected/c46b8569-66f8-4b4e-a646-53a71f3e4ef5-kube-api-access-2krsx\") pod \"certified-operators-2fkn4\" (UID: \"c46b8569-66f8-4b4e-a646-53a71f3e4ef5\") " pod="openshift-marketplace/certified-operators-2fkn4" Mar 20 07:50:24 crc kubenswrapper[4749]: I0320 07:50:24.936304 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2fkn4" Mar 20 07:50:25 crc kubenswrapper[4749]: I0320 07:50:25.177106 4749 scope.go:117] "RemoveContainer" containerID="364d6fbf9694bce4c59b5da272a600ca47f38998291aad7a3137e901c1d7e851" Mar 20 07:50:25 crc kubenswrapper[4749]: E0320 07:50:25.177930 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:50:25 crc kubenswrapper[4749]: I0320 07:50:25.423731 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2fkn4"] Mar 20 07:50:26 crc kubenswrapper[4749]: I0320 07:50:26.408603 4749 generic.go:334] "Generic (PLEG): container finished" podID="c46b8569-66f8-4b4e-a646-53a71f3e4ef5" containerID="0b4992d52865674773c62853a746e4fb03b0afba8b3319951367116491d6ae41" exitCode=0 Mar 20 07:50:26 crc kubenswrapper[4749]: I0320 07:50:26.408797 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2fkn4" event={"ID":"c46b8569-66f8-4b4e-a646-53a71f3e4ef5","Type":"ContainerDied","Data":"0b4992d52865674773c62853a746e4fb03b0afba8b3319951367116491d6ae41"} Mar 20 07:50:26 crc kubenswrapper[4749]: I0320 07:50:26.408992 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2fkn4" event={"ID":"c46b8569-66f8-4b4e-a646-53a71f3e4ef5","Type":"ContainerStarted","Data":"b001518cf0556af9bdf60250ed3711899d40bcc84cd786ea473cf3c8dde36bfb"} Mar 20 07:50:28 crc kubenswrapper[4749]: I0320 07:50:28.439914 4749 generic.go:334] "Generic (PLEG): container finished" podID="c46b8569-66f8-4b4e-a646-53a71f3e4ef5" containerID="41b708f7460dd24a175bfa249f7377458c26409709ffa95ddbeaf743e270c6e2" exitCode=0 Mar 20 07:50:28 crc kubenswrapper[4749]: I0320 07:50:28.440008 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2fkn4" event={"ID":"c46b8569-66f8-4b4e-a646-53a71f3e4ef5","Type":"ContainerDied","Data":"41b708f7460dd24a175bfa249f7377458c26409709ffa95ddbeaf743e270c6e2"} Mar 20 07:50:29 crc kubenswrapper[4749]: I0320 07:50:29.449367 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2fkn4" event={"ID":"c46b8569-66f8-4b4e-a646-53a71f3e4ef5","Type":"ContainerStarted","Data":"3b36c5441310cf43bd9e1cbc5c4f9ea6e4f268d4c984f33fab23f62699f58184"} Mar 20 07:50:29 crc kubenswrapper[4749]: I0320 07:50:29.471836 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2fkn4" podStartSLOduration=2.966487103 podStartE2EDuration="5.471795297s" podCreationTimestamp="2026-03-20 07:50:24 +0000 UTC" firstStartedPulling="2026-03-20 07:50:26.412849865 +0000 UTC m=+2262.962507562" lastFinishedPulling="2026-03-20 07:50:28.918158099 +0000 UTC m=+2265.467815756" observedRunningTime="2026-03-20 
07:50:29.468137329 +0000 UTC m=+2266.017794996" watchObservedRunningTime="2026-03-20 07:50:29.471795297 +0000 UTC m=+2266.021452954" Mar 20 07:50:30 crc kubenswrapper[4749]: I0320 07:50:30.178706 4749 scope.go:117] "RemoveContainer" containerID="0ff9be3875797ada3dfffbc86acf0005e90268274e3d10bb1025a8c4c1ddfc14" Mar 20 07:50:30 crc kubenswrapper[4749]: E0320 07:50:30.179685 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 07:50:34 crc kubenswrapper[4749]: I0320 07:50:34.937273 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2fkn4" Mar 20 07:50:34 crc kubenswrapper[4749]: I0320 07:50:34.937962 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2fkn4" Mar 20 07:50:35 crc kubenswrapper[4749]: I0320 07:50:35.002375 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2fkn4" Mar 20 07:50:35 crc kubenswrapper[4749]: I0320 07:50:35.571225 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2fkn4" Mar 20 07:50:35 crc kubenswrapper[4749]: I0320 07:50:35.616048 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2fkn4"] Mar 20 07:50:37 crc kubenswrapper[4749]: I0320 07:50:37.177184 4749 scope.go:117] "RemoveContainer" containerID="6623fc7f986ec52db4e939e8a32c1ab92b205fb441e21157a62a09d0ff2ab3ec" Mar 20 07:50:37 crc kubenswrapper[4749]: E0320 07:50:37.177488 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:50:37 crc kubenswrapper[4749]: I0320 07:50:37.520034 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2fkn4" podUID="c46b8569-66f8-4b4e-a646-53a71f3e4ef5" containerName="registry-server" containerID="cri-o://3b36c5441310cf43bd9e1cbc5c4f9ea6e4f268d4c984f33fab23f62699f58184" gracePeriod=2 Mar 20 07:50:38 crc kubenswrapper[4749]: I0320 07:50:38.035204 4749 util.go:48] "No ready sandbox for pod can be found. 
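
In the pod_startup_latency_tracker entry above, podStartSLOduration (2.966487103s) is podStartE2EDuration (5.471795297s) minus the image-pull window bounded by firstStartedPulling and lastFinishedPulling; the monotonic m=+... clock readings in the entry reproduce the logged value exactly. A short check of that arithmetic, with the subtraction rule inferred from the logged numbers rather than quoted from the tracker's source:

// slo.go: re-derive podStartSLOduration for certified-operators-2fkn4
// from the values in the tracker entry above. The relationship
//   SLO = E2E - (lastFinishedPulling - firstStartedPulling)
// is inferred from the logged numbers (monotonic m=+... readings).
package main

import "fmt"

func main() {
	const (
		e2e              = 5.471795297    // podStartE2EDuration, seconds
		firstStartedPull = 2262.962507562 // m=+... at firstStartedPulling
		lastFinishedPull = 2265.467815756 // m=+... at lastFinishedPulling
	)
	pullWindow := lastFinishedPull - firstStartedPull
	fmt.Printf("image pull window: %.9fs\n", pullWindow)
	fmt.Printf("derived SLO duration: %.9fs (logged: 2.966487103s)\n", e2e-pullWindow)
}
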
Need to start a new one" pod="openshift-marketplace/certified-operators-2fkn4" Mar 20 07:50:38 crc kubenswrapper[4749]: I0320 07:50:38.142680 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c46b8569-66f8-4b4e-a646-53a71f3e4ef5-catalog-content\") pod \"c46b8569-66f8-4b4e-a646-53a71f3e4ef5\" (UID: \"c46b8569-66f8-4b4e-a646-53a71f3e4ef5\") " Mar 20 07:50:38 crc kubenswrapper[4749]: I0320 07:50:38.142815 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2krsx\" (UniqueName: \"kubernetes.io/projected/c46b8569-66f8-4b4e-a646-53a71f3e4ef5-kube-api-access-2krsx\") pod \"c46b8569-66f8-4b4e-a646-53a71f3e4ef5\" (UID: \"c46b8569-66f8-4b4e-a646-53a71f3e4ef5\") " Mar 20 07:50:38 crc kubenswrapper[4749]: I0320 07:50:38.142873 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c46b8569-66f8-4b4e-a646-53a71f3e4ef5-utilities\") pod \"c46b8569-66f8-4b4e-a646-53a71f3e4ef5\" (UID: \"c46b8569-66f8-4b4e-a646-53a71f3e4ef5\") " Mar 20 07:50:38 crc kubenswrapper[4749]: I0320 07:50:38.143774 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c46b8569-66f8-4b4e-a646-53a71f3e4ef5-utilities" (OuterVolumeSpecName: "utilities") pod "c46b8569-66f8-4b4e-a646-53a71f3e4ef5" (UID: "c46b8569-66f8-4b4e-a646-53a71f3e4ef5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:50:38 crc kubenswrapper[4749]: I0320 07:50:38.150610 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c46b8569-66f8-4b4e-a646-53a71f3e4ef5-kube-api-access-2krsx" (OuterVolumeSpecName: "kube-api-access-2krsx") pod "c46b8569-66f8-4b4e-a646-53a71f3e4ef5" (UID: "c46b8569-66f8-4b4e-a646-53a71f3e4ef5"). InnerVolumeSpecName "kube-api-access-2krsx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:50:38 crc kubenswrapper[4749]: I0320 07:50:38.194915 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c46b8569-66f8-4b4e-a646-53a71f3e4ef5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c46b8569-66f8-4b4e-a646-53a71f3e4ef5" (UID: "c46b8569-66f8-4b4e-a646-53a71f3e4ef5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:50:38 crc kubenswrapper[4749]: I0320 07:50:38.244429 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2krsx\" (UniqueName: \"kubernetes.io/projected/c46b8569-66f8-4b4e-a646-53a71f3e4ef5-kube-api-access-2krsx\") on node \"crc\" DevicePath \"\"" Mar 20 07:50:38 crc kubenswrapper[4749]: I0320 07:50:38.244467 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c46b8569-66f8-4b4e-a646-53a71f3e4ef5-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 07:50:38 crc kubenswrapper[4749]: I0320 07:50:38.244479 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c46b8569-66f8-4b4e-a646-53a71f3e4ef5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 07:50:38 crc kubenswrapper[4749]: I0320 07:50:38.530922 4749 generic.go:334] "Generic (PLEG): container finished" podID="c46b8569-66f8-4b4e-a646-53a71f3e4ef5" containerID="3b36c5441310cf43bd9e1cbc5c4f9ea6e4f268d4c984f33fab23f62699f58184" exitCode=0 Mar 20 07:50:38 crc kubenswrapper[4749]: I0320 07:50:38.531020 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2fkn4" Mar 20 07:50:38 crc kubenswrapper[4749]: I0320 07:50:38.531043 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2fkn4" event={"ID":"c46b8569-66f8-4b4e-a646-53a71f3e4ef5","Type":"ContainerDied","Data":"3b36c5441310cf43bd9e1cbc5c4f9ea6e4f268d4c984f33fab23f62699f58184"} Mar 20 07:50:38 crc kubenswrapper[4749]: I0320 07:50:38.531451 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2fkn4" event={"ID":"c46b8569-66f8-4b4e-a646-53a71f3e4ef5","Type":"ContainerDied","Data":"b001518cf0556af9bdf60250ed3711899d40bcc84cd786ea473cf3c8dde36bfb"} Mar 20 07:50:38 crc kubenswrapper[4749]: I0320 07:50:38.531483 4749 scope.go:117] "RemoveContainer" containerID="3b36c5441310cf43bd9e1cbc5c4f9ea6e4f268d4c984f33fab23f62699f58184" Mar 20 07:50:38 crc kubenswrapper[4749]: I0320 07:50:38.570333 4749 scope.go:117] "RemoveContainer" containerID="41b708f7460dd24a175bfa249f7377458c26409709ffa95ddbeaf743e270c6e2" Mar 20 07:50:38 crc kubenswrapper[4749]: I0320 07:50:38.574017 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2fkn4"] Mar 20 07:50:38 crc kubenswrapper[4749]: I0320 07:50:38.583875 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2fkn4"] Mar 20 07:50:38 crc kubenswrapper[4749]: I0320 07:50:38.590539 4749 scope.go:117] "RemoveContainer" containerID="0b4992d52865674773c62853a746e4fb03b0afba8b3319951367116491d6ae41" Mar 20 07:50:38 crc kubenswrapper[4749]: I0320 07:50:38.645595 4749 scope.go:117] "RemoveContainer" containerID="3b36c5441310cf43bd9e1cbc5c4f9ea6e4f268d4c984f33fab23f62699f58184" Mar 20 07:50:38 crc kubenswrapper[4749]: E0320 07:50:38.646096 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b36c5441310cf43bd9e1cbc5c4f9ea6e4f268d4c984f33fab23f62699f58184\": container with ID starting with 3b36c5441310cf43bd9e1cbc5c4f9ea6e4f268d4c984f33fab23f62699f58184 not found: ID does not exist" containerID="3b36c5441310cf43bd9e1cbc5c4f9ea6e4f268d4c984f33fab23f62699f58184" Mar 20 07:50:38 crc kubenswrapper[4749]: I0320 07:50:38.646126 
4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b36c5441310cf43bd9e1cbc5c4f9ea6e4f268d4c984f33fab23f62699f58184"} err="failed to get container status \"3b36c5441310cf43bd9e1cbc5c4f9ea6e4f268d4c984f33fab23f62699f58184\": rpc error: code = NotFound desc = could not find container \"3b36c5441310cf43bd9e1cbc5c4f9ea6e4f268d4c984f33fab23f62699f58184\": container with ID starting with 3b36c5441310cf43bd9e1cbc5c4f9ea6e4f268d4c984f33fab23f62699f58184 not found: ID does not exist" Mar 20 07:50:38 crc kubenswrapper[4749]: I0320 07:50:38.646146 4749 scope.go:117] "RemoveContainer" containerID="41b708f7460dd24a175bfa249f7377458c26409709ffa95ddbeaf743e270c6e2" Mar 20 07:50:38 crc kubenswrapper[4749]: E0320 07:50:38.646690 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41b708f7460dd24a175bfa249f7377458c26409709ffa95ddbeaf743e270c6e2\": container with ID starting with 41b708f7460dd24a175bfa249f7377458c26409709ffa95ddbeaf743e270c6e2 not found: ID does not exist" containerID="41b708f7460dd24a175bfa249f7377458c26409709ffa95ddbeaf743e270c6e2" Mar 20 07:50:38 crc kubenswrapper[4749]: I0320 07:50:38.646736 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41b708f7460dd24a175bfa249f7377458c26409709ffa95ddbeaf743e270c6e2"} err="failed to get container status \"41b708f7460dd24a175bfa249f7377458c26409709ffa95ddbeaf743e270c6e2\": rpc error: code = NotFound desc = could not find container \"41b708f7460dd24a175bfa249f7377458c26409709ffa95ddbeaf743e270c6e2\": container with ID starting with 41b708f7460dd24a175bfa249f7377458c26409709ffa95ddbeaf743e270c6e2 not found: ID does not exist" Mar 20 07:50:38 crc kubenswrapper[4749]: I0320 07:50:38.646793 4749 scope.go:117] "RemoveContainer" containerID="0b4992d52865674773c62853a746e4fb03b0afba8b3319951367116491d6ae41" Mar 20 07:50:38 crc kubenswrapper[4749]: E0320 07:50:38.647139 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b4992d52865674773c62853a746e4fb03b0afba8b3319951367116491d6ae41\": container with ID starting with 0b4992d52865674773c62853a746e4fb03b0afba8b3319951367116491d6ae41 not found: ID does not exist" containerID="0b4992d52865674773c62853a746e4fb03b0afba8b3319951367116491d6ae41" Mar 20 07:50:38 crc kubenswrapper[4749]: I0320 07:50:38.647202 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b4992d52865674773c62853a746e4fb03b0afba8b3319951367116491d6ae41"} err="failed to get container status \"0b4992d52865674773c62853a746e4fb03b0afba8b3319951367116491d6ae41\": rpc error: code = NotFound desc = could not find container \"0b4992d52865674773c62853a746e4fb03b0afba8b3319951367116491d6ae41\": container with ID starting with 0b4992d52865674773c62853a746e4fb03b0afba8b3319951367116491d6ae41 not found: ID does not exist" Mar 20 07:50:39 crc kubenswrapper[4749]: I0320 07:50:39.177744 4749 scope.go:117] "RemoveContainer" containerID="364d6fbf9694bce4c59b5da272a600ca47f38998291aad7a3137e901c1d7e851" Mar 20 07:50:39 crc kubenswrapper[4749]: E0320 07:50:39.178159 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" 
podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:50:40 crc kubenswrapper[4749]: I0320 07:50:40.194466 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c46b8569-66f8-4b4e-a646-53a71f3e4ef5" path="/var/lib/kubelet/pods/c46b8569-66f8-4b4e-a646-53a71f3e4ef5/volumes" Mar 20 07:50:42 crc kubenswrapper[4749]: I0320 07:50:42.178328 4749 scope.go:117] "RemoveContainer" containerID="0ff9be3875797ada3dfffbc86acf0005e90268274e3d10bb1025a8c4c1ddfc14" Mar 20 07:50:42 crc kubenswrapper[4749]: E0320 07:50:42.179326 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 07:50:44 crc kubenswrapper[4749]: E0320 07:50:44.865135 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/rpm-ostreed.service\": RecentStats: unable to find data in memory cache]" Mar 20 07:50:48 crc kubenswrapper[4749]: I0320 07:50:48.178386 4749 scope.go:117] "RemoveContainer" containerID="6623fc7f986ec52db4e939e8a32c1ab92b205fb441e21157a62a09d0ff2ab3ec" Mar 20 07:50:48 crc kubenswrapper[4749]: E0320 07:50:48.179549 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:50:53 crc kubenswrapper[4749]: I0320 07:50:53.178227 4749 scope.go:117] "RemoveContainer" containerID="364d6fbf9694bce4c59b5da272a600ca47f38998291aad7a3137e901c1d7e851" Mar 20 07:50:53 crc kubenswrapper[4749]: E0320 07:50:53.178888 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:50:54 crc kubenswrapper[4749]: I0320 07:50:54.182207 4749 scope.go:117] "RemoveContainer" containerID="0ff9be3875797ada3dfffbc86acf0005e90268274e3d10bb1025a8c4c1ddfc14" Mar 20 07:50:54 crc kubenswrapper[4749]: E0320 07:50:54.182912 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 07:50:55 crc kubenswrapper[4749]: I0320 07:50:55.942920 4749 scope.go:117] "RemoveContainer" containerID="e5be83a5fc8f67aabb41d6f791b86d718017f21ece3e7b2b6ee9aaaf48f64558" Mar 20 07:50:59 crc kubenswrapper[4749]: I0320 07:50:59.177506 4749 scope.go:117] "RemoveContainer" containerID="6623fc7f986ec52db4e939e8a32c1ab92b205fb441e21157a62a09d0ff2ab3ec" Mar 20 07:50:59 crc kubenswrapper[4749]: E0320 07:50:59.178151 4749 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:51:03 crc kubenswrapper[4749]: I0320 07:51:03.232415 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wqxlp"] Mar 20 07:51:03 crc kubenswrapper[4749]: E0320 07:51:03.233470 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c46b8569-66f8-4b4e-a646-53a71f3e4ef5" containerName="extract-utilities" Mar 20 07:51:03 crc kubenswrapper[4749]: I0320 07:51:03.233492 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c46b8569-66f8-4b4e-a646-53a71f3e4ef5" containerName="extract-utilities" Mar 20 07:51:03 crc kubenswrapper[4749]: E0320 07:51:03.233508 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c46b8569-66f8-4b4e-a646-53a71f3e4ef5" containerName="extract-content" Mar 20 07:51:03 crc kubenswrapper[4749]: I0320 07:51:03.233519 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c46b8569-66f8-4b4e-a646-53a71f3e4ef5" containerName="extract-content" Mar 20 07:51:03 crc kubenswrapper[4749]: E0320 07:51:03.233566 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c46b8569-66f8-4b4e-a646-53a71f3e4ef5" containerName="registry-server" Mar 20 07:51:03 crc kubenswrapper[4749]: I0320 07:51:03.233574 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c46b8569-66f8-4b4e-a646-53a71f3e4ef5" containerName="registry-server" Mar 20 07:51:03 crc kubenswrapper[4749]: I0320 07:51:03.233787 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c46b8569-66f8-4b4e-a646-53a71f3e4ef5" containerName="registry-server" Mar 20 07:51:03 crc kubenswrapper[4749]: I0320 07:51:03.235264 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wqxlp" Mar 20 07:51:03 crc kubenswrapper[4749]: I0320 07:51:03.241454 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wqxlp"] Mar 20 07:51:03 crc kubenswrapper[4749]: I0320 07:51:03.327363 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1efe4169-044f-4556-906e-8d193119ac73-catalog-content\") pod \"redhat-marketplace-wqxlp\" (UID: \"1efe4169-044f-4556-906e-8d193119ac73\") " pod="openshift-marketplace/redhat-marketplace-wqxlp" Mar 20 07:51:03 crc kubenswrapper[4749]: I0320 07:51:03.327784 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1efe4169-044f-4556-906e-8d193119ac73-utilities\") pod \"redhat-marketplace-wqxlp\" (UID: \"1efe4169-044f-4556-906e-8d193119ac73\") " pod="openshift-marketplace/redhat-marketplace-wqxlp" Mar 20 07:51:03 crc kubenswrapper[4749]: I0320 07:51:03.327816 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grvzx\" (UniqueName: \"kubernetes.io/projected/1efe4169-044f-4556-906e-8d193119ac73-kube-api-access-grvzx\") pod \"redhat-marketplace-wqxlp\" (UID: \"1efe4169-044f-4556-906e-8d193119ac73\") " pod="openshift-marketplace/redhat-marketplace-wqxlp" Mar 20 07:51:03 crc kubenswrapper[4749]: I0320 07:51:03.429614 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1efe4169-044f-4556-906e-8d193119ac73-catalog-content\") pod \"redhat-marketplace-wqxlp\" (UID: \"1efe4169-044f-4556-906e-8d193119ac73\") " pod="openshift-marketplace/redhat-marketplace-wqxlp" Mar 20 07:51:03 crc kubenswrapper[4749]: I0320 07:51:03.429713 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1efe4169-044f-4556-906e-8d193119ac73-utilities\") pod \"redhat-marketplace-wqxlp\" (UID: \"1efe4169-044f-4556-906e-8d193119ac73\") " pod="openshift-marketplace/redhat-marketplace-wqxlp" Mar 20 07:51:03 crc kubenswrapper[4749]: I0320 07:51:03.429735 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grvzx\" (UniqueName: \"kubernetes.io/projected/1efe4169-044f-4556-906e-8d193119ac73-kube-api-access-grvzx\") pod \"redhat-marketplace-wqxlp\" (UID: \"1efe4169-044f-4556-906e-8d193119ac73\") " pod="openshift-marketplace/redhat-marketplace-wqxlp" Mar 20 07:51:03 crc kubenswrapper[4749]: I0320 07:51:03.430396 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1efe4169-044f-4556-906e-8d193119ac73-catalog-content\") pod \"redhat-marketplace-wqxlp\" (UID: \"1efe4169-044f-4556-906e-8d193119ac73\") " pod="openshift-marketplace/redhat-marketplace-wqxlp" Mar 20 07:51:03 crc kubenswrapper[4749]: I0320 07:51:03.430600 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1efe4169-044f-4556-906e-8d193119ac73-utilities\") pod \"redhat-marketplace-wqxlp\" (UID: \"1efe4169-044f-4556-906e-8d193119ac73\") " pod="openshift-marketplace/redhat-marketplace-wqxlp" Mar 20 07:51:03 crc kubenswrapper[4749]: I0320 07:51:03.450516 4749 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-grvzx\" (UniqueName: \"kubernetes.io/projected/1efe4169-044f-4556-906e-8d193119ac73-kube-api-access-grvzx\") pod \"redhat-marketplace-wqxlp\" (UID: \"1efe4169-044f-4556-906e-8d193119ac73\") " pod="openshift-marketplace/redhat-marketplace-wqxlp" Mar 20 07:51:03 crc kubenswrapper[4749]: I0320 07:51:03.555981 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wqxlp" Mar 20 07:51:03 crc kubenswrapper[4749]: I0320 07:51:03.999432 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wqxlp"] Mar 20 07:51:04 crc kubenswrapper[4749]: I0320 07:51:04.177270 4749 scope.go:117] "RemoveContainer" containerID="364d6fbf9694bce4c59b5da272a600ca47f38998291aad7a3137e901c1d7e851" Mar 20 07:51:04 crc kubenswrapper[4749]: E0320 07:51:04.177857 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:51:04 crc kubenswrapper[4749]: I0320 07:51:04.781766 4749 generic.go:334] "Generic (PLEG): container finished" podID="1efe4169-044f-4556-906e-8d193119ac73" containerID="02ffa284a9234d0bc12e8533a3462a2a2052e399d6c33cbdcdc5532e23dce091" exitCode=0 Mar 20 07:51:04 crc kubenswrapper[4749]: I0320 07:51:04.781865 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wqxlp" event={"ID":"1efe4169-044f-4556-906e-8d193119ac73","Type":"ContainerDied","Data":"02ffa284a9234d0bc12e8533a3462a2a2052e399d6c33cbdcdc5532e23dce091"} Mar 20 07:51:04 crc kubenswrapper[4749]: I0320 07:51:04.784638 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wqxlp" event={"ID":"1efe4169-044f-4556-906e-8d193119ac73","Type":"ContainerStarted","Data":"65c2269dd637dab8563a1e761b5d52881e75115bc6862419389e5a1a7680cdf0"} Mar 20 07:51:05 crc kubenswrapper[4749]: I0320 07:51:05.792157 4749 generic.go:334] "Generic (PLEG): container finished" podID="1efe4169-044f-4556-906e-8d193119ac73" containerID="528ac675bb7a971c90b7155acd32a1b1e7d65ab7fe4769436af1137dcdd55edc" exitCode=0 Mar 20 07:51:05 crc kubenswrapper[4749]: I0320 07:51:05.792366 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wqxlp" event={"ID":"1efe4169-044f-4556-906e-8d193119ac73","Type":"ContainerDied","Data":"528ac675bb7a971c90b7155acd32a1b1e7d65ab7fe4769436af1137dcdd55edc"} Mar 20 07:51:06 crc kubenswrapper[4749]: I0320 07:51:06.801312 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wqxlp" event={"ID":"1efe4169-044f-4556-906e-8d193119ac73","Type":"ContainerStarted","Data":"6ebc664f4551579d21884cf09e6c7ef5b76ded0d6e8929e4e7a0ed037b997490"} Mar 20 07:51:06 crc kubenswrapper[4749]: I0320 07:51:06.820126 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wqxlp" podStartSLOduration=2.354891588 podStartE2EDuration="3.820109458s" podCreationTimestamp="2026-03-20 07:51:03 +0000 UTC" firstStartedPulling="2026-03-20 07:51:04.783761025 +0000 UTC m=+2301.333418672" lastFinishedPulling="2026-03-20 07:51:06.248978865 +0000 UTC m=+2302.798636542" observedRunningTime="2026-03-20 07:51:06.815816934 +0000 UTC 
m=+2303.365474591" watchObservedRunningTime="2026-03-20 07:51:06.820109458 +0000 UTC m=+2303.369767095" Mar 20 07:51:08 crc kubenswrapper[4749]: I0320 07:51:08.177761 4749 scope.go:117] "RemoveContainer" containerID="0ff9be3875797ada3dfffbc86acf0005e90268274e3d10bb1025a8c4c1ddfc14" Mar 20 07:51:08 crc kubenswrapper[4749]: E0320 07:51:08.178499 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 07:51:11 crc kubenswrapper[4749]: I0320 07:51:11.177704 4749 scope.go:117] "RemoveContainer" containerID="6623fc7f986ec52db4e939e8a32c1ab92b205fb441e21157a62a09d0ff2ab3ec" Mar 20 07:51:11 crc kubenswrapper[4749]: E0320 07:51:11.178365 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:51:13 crc kubenswrapper[4749]: I0320 07:51:13.557151 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wqxlp" Mar 20 07:51:13 crc kubenswrapper[4749]: I0320 07:51:13.558495 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wqxlp" Mar 20 07:51:13 crc kubenswrapper[4749]: I0320 07:51:13.632911 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wqxlp" Mar 20 07:51:13 crc kubenswrapper[4749]: I0320 07:51:13.957411 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wqxlp" Mar 20 07:51:15 crc kubenswrapper[4749]: I0320 07:51:15.177398 4749 scope.go:117] "RemoveContainer" containerID="364d6fbf9694bce4c59b5da272a600ca47f38998291aad7a3137e901c1d7e851" Mar 20 07:51:15 crc kubenswrapper[4749]: E0320 07:51:15.177987 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:51:16 crc kubenswrapper[4749]: I0320 07:51:16.423624 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wqxlp"] Mar 20 07:51:16 crc kubenswrapper[4749]: I0320 07:51:16.890591 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wqxlp" podUID="1efe4169-044f-4556-906e-8d193119ac73" containerName="registry-server" containerID="cri-o://6ebc664f4551579d21884cf09e6c7ef5b76ded0d6e8929e4e7a0ed037b997490" gracePeriod=2 Mar 20 07:51:17 crc kubenswrapper[4749]: I0320 07:51:17.244433 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wqxlp" Mar 20 07:51:17 crc kubenswrapper[4749]: I0320 07:51:17.377134 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1efe4169-044f-4556-906e-8d193119ac73-catalog-content\") pod \"1efe4169-044f-4556-906e-8d193119ac73\" (UID: \"1efe4169-044f-4556-906e-8d193119ac73\") " Mar 20 07:51:17 crc kubenswrapper[4749]: I0320 07:51:17.377472 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1efe4169-044f-4556-906e-8d193119ac73-utilities\") pod \"1efe4169-044f-4556-906e-8d193119ac73\" (UID: \"1efe4169-044f-4556-906e-8d193119ac73\") " Mar 20 07:51:17 crc kubenswrapper[4749]: I0320 07:51:17.377538 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grvzx\" (UniqueName: \"kubernetes.io/projected/1efe4169-044f-4556-906e-8d193119ac73-kube-api-access-grvzx\") pod \"1efe4169-044f-4556-906e-8d193119ac73\" (UID: \"1efe4169-044f-4556-906e-8d193119ac73\") " Mar 20 07:51:17 crc kubenswrapper[4749]: I0320 07:51:17.380188 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1efe4169-044f-4556-906e-8d193119ac73-utilities" (OuterVolumeSpecName: "utilities") pod "1efe4169-044f-4556-906e-8d193119ac73" (UID: "1efe4169-044f-4556-906e-8d193119ac73"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:51:17 crc kubenswrapper[4749]: I0320 07:51:17.387251 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1efe4169-044f-4556-906e-8d193119ac73-kube-api-access-grvzx" (OuterVolumeSpecName: "kube-api-access-grvzx") pod "1efe4169-044f-4556-906e-8d193119ac73" (UID: "1efe4169-044f-4556-906e-8d193119ac73"). InnerVolumeSpecName "kube-api-access-grvzx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:51:17 crc kubenswrapper[4749]: I0320 07:51:17.428451 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1efe4169-044f-4556-906e-8d193119ac73-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1efe4169-044f-4556-906e-8d193119ac73" (UID: "1efe4169-044f-4556-906e-8d193119ac73"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 07:51:17 crc kubenswrapper[4749]: I0320 07:51:17.480475 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1efe4169-044f-4556-906e-8d193119ac73-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 07:51:17 crc kubenswrapper[4749]: I0320 07:51:17.480519 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1efe4169-044f-4556-906e-8d193119ac73-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 07:51:17 crc kubenswrapper[4749]: I0320 07:51:17.480532 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grvzx\" (UniqueName: \"kubernetes.io/projected/1efe4169-044f-4556-906e-8d193119ac73-kube-api-access-grvzx\") on node \"crc\" DevicePath \"\"" Mar 20 07:51:17 crc kubenswrapper[4749]: I0320 07:51:17.901883 4749 generic.go:334] "Generic (PLEG): container finished" podID="1efe4169-044f-4556-906e-8d193119ac73" containerID="6ebc664f4551579d21884cf09e6c7ef5b76ded0d6e8929e4e7a0ed037b997490" exitCode=0 Mar 20 07:51:17 crc kubenswrapper[4749]: I0320 07:51:17.901944 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wqxlp" event={"ID":"1efe4169-044f-4556-906e-8d193119ac73","Type":"ContainerDied","Data":"6ebc664f4551579d21884cf09e6c7ef5b76ded0d6e8929e4e7a0ed037b997490"} Mar 20 07:51:17 crc kubenswrapper[4749]: I0320 07:51:17.901996 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wqxlp" event={"ID":"1efe4169-044f-4556-906e-8d193119ac73","Type":"ContainerDied","Data":"65c2269dd637dab8563a1e761b5d52881e75115bc6862419389e5a1a7680cdf0"} Mar 20 07:51:17 crc kubenswrapper[4749]: I0320 07:51:17.902009 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wqxlp" Mar 20 07:51:17 crc kubenswrapper[4749]: I0320 07:51:17.902047 4749 scope.go:117] "RemoveContainer" containerID="6ebc664f4551579d21884cf09e6c7ef5b76ded0d6e8929e4e7a0ed037b997490" Mar 20 07:51:17 crc kubenswrapper[4749]: I0320 07:51:17.929423 4749 scope.go:117] "RemoveContainer" containerID="528ac675bb7a971c90b7155acd32a1b1e7d65ab7fe4769436af1137dcdd55edc" Mar 20 07:51:17 crc kubenswrapper[4749]: I0320 07:51:17.960348 4749 scope.go:117] "RemoveContainer" containerID="02ffa284a9234d0bc12e8533a3462a2a2052e399d6c33cbdcdc5532e23dce091" Mar 20 07:51:17 crc kubenswrapper[4749]: I0320 07:51:17.961437 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wqxlp"] Mar 20 07:51:17 crc kubenswrapper[4749]: I0320 07:51:17.972261 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wqxlp"] Mar 20 07:51:18 crc kubenswrapper[4749]: I0320 07:51:18.010679 4749 scope.go:117] "RemoveContainer" containerID="6ebc664f4551579d21884cf09e6c7ef5b76ded0d6e8929e4e7a0ed037b997490" Mar 20 07:51:18 crc kubenswrapper[4749]: E0320 07:51:18.013260 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ebc664f4551579d21884cf09e6c7ef5b76ded0d6e8929e4e7a0ed037b997490\": container with ID starting with 6ebc664f4551579d21884cf09e6c7ef5b76ded0d6e8929e4e7a0ed037b997490 not found: ID does not exist" containerID="6ebc664f4551579d21884cf09e6c7ef5b76ded0d6e8929e4e7a0ed037b997490" Mar 20 07:51:18 crc kubenswrapper[4749]: I0320 07:51:18.013469 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ebc664f4551579d21884cf09e6c7ef5b76ded0d6e8929e4e7a0ed037b997490"} err="failed to get container status \"6ebc664f4551579d21884cf09e6c7ef5b76ded0d6e8929e4e7a0ed037b997490\": rpc error: code = NotFound desc = could not find container \"6ebc664f4551579d21884cf09e6c7ef5b76ded0d6e8929e4e7a0ed037b997490\": container with ID starting with 6ebc664f4551579d21884cf09e6c7ef5b76ded0d6e8929e4e7a0ed037b997490 not found: ID does not exist" Mar 20 07:51:18 crc kubenswrapper[4749]: I0320 07:51:18.013584 4749 scope.go:117] "RemoveContainer" containerID="528ac675bb7a971c90b7155acd32a1b1e7d65ab7fe4769436af1137dcdd55edc" Mar 20 07:51:18 crc kubenswrapper[4749]: E0320 07:51:18.014067 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"528ac675bb7a971c90b7155acd32a1b1e7d65ab7fe4769436af1137dcdd55edc\": container with ID starting with 528ac675bb7a971c90b7155acd32a1b1e7d65ab7fe4769436af1137dcdd55edc not found: ID does not exist" containerID="528ac675bb7a971c90b7155acd32a1b1e7d65ab7fe4769436af1137dcdd55edc" Mar 20 07:51:18 crc kubenswrapper[4749]: I0320 07:51:18.014207 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"528ac675bb7a971c90b7155acd32a1b1e7d65ab7fe4769436af1137dcdd55edc"} err="failed to get container status \"528ac675bb7a971c90b7155acd32a1b1e7d65ab7fe4769436af1137dcdd55edc\": rpc error: code = NotFound desc = could not find container \"528ac675bb7a971c90b7155acd32a1b1e7d65ab7fe4769436af1137dcdd55edc\": container with ID starting with 528ac675bb7a971c90b7155acd32a1b1e7d65ab7fe4769436af1137dcdd55edc not found: ID does not exist" Mar 20 07:51:18 crc kubenswrapper[4749]: I0320 07:51:18.014356 4749 scope.go:117] "RemoveContainer" 
containerID="02ffa284a9234d0bc12e8533a3462a2a2052e399d6c33cbdcdc5532e23dce091" Mar 20 07:51:18 crc kubenswrapper[4749]: E0320 07:51:18.014880 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02ffa284a9234d0bc12e8533a3462a2a2052e399d6c33cbdcdc5532e23dce091\": container with ID starting with 02ffa284a9234d0bc12e8533a3462a2a2052e399d6c33cbdcdc5532e23dce091 not found: ID does not exist" containerID="02ffa284a9234d0bc12e8533a3462a2a2052e399d6c33cbdcdc5532e23dce091" Mar 20 07:51:18 crc kubenswrapper[4749]: I0320 07:51:18.014925 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02ffa284a9234d0bc12e8533a3462a2a2052e399d6c33cbdcdc5532e23dce091"} err="failed to get container status \"02ffa284a9234d0bc12e8533a3462a2a2052e399d6c33cbdcdc5532e23dce091\": rpc error: code = NotFound desc = could not find container \"02ffa284a9234d0bc12e8533a3462a2a2052e399d6c33cbdcdc5532e23dce091\": container with ID starting with 02ffa284a9234d0bc12e8533a3462a2a2052e399d6c33cbdcdc5532e23dce091 not found: ID does not exist" Mar 20 07:51:18 crc kubenswrapper[4749]: I0320 07:51:18.197730 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1efe4169-044f-4556-906e-8d193119ac73" path="/var/lib/kubelet/pods/1efe4169-044f-4556-906e-8d193119ac73/volumes" Mar 20 07:51:22 crc kubenswrapper[4749]: I0320 07:51:22.177897 4749 scope.go:117] "RemoveContainer" containerID="0ff9be3875797ada3dfffbc86acf0005e90268274e3d10bb1025a8c4c1ddfc14" Mar 20 07:51:22 crc kubenswrapper[4749]: E0320 07:51:22.178923 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 07:51:25 crc kubenswrapper[4749]: I0320 07:51:25.177019 4749 scope.go:117] "RemoveContainer" containerID="6623fc7f986ec52db4e939e8a32c1ab92b205fb441e21157a62a09d0ff2ab3ec" Mar 20 07:51:25 crc kubenswrapper[4749]: E0320 07:51:25.177513 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:51:30 crc kubenswrapper[4749]: I0320 07:51:30.177973 4749 scope.go:117] "RemoveContainer" containerID="364d6fbf9694bce4c59b5da272a600ca47f38998291aad7a3137e901c1d7e851" Mar 20 07:51:30 crc kubenswrapper[4749]: E0320 07:51:30.178640 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:51:36 crc kubenswrapper[4749]: I0320 07:51:36.178135 4749 scope.go:117] "RemoveContainer" containerID="0ff9be3875797ada3dfffbc86acf0005e90268274e3d10bb1025a8c4c1ddfc14" Mar 20 07:51:36 crc kubenswrapper[4749]: E0320 07:51:36.180105 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 07:51:39 crc kubenswrapper[4749]: I0320 07:51:39.178185 4749 scope.go:117] "RemoveContainer" containerID="6623fc7f986ec52db4e939e8a32c1ab92b205fb441e21157a62a09d0ff2ab3ec" Mar 20 07:51:39 crc kubenswrapper[4749]: E0320 07:51:39.178921 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:51:45 crc kubenswrapper[4749]: I0320 07:51:45.178131 4749 scope.go:117] "RemoveContainer" containerID="364d6fbf9694bce4c59b5da272a600ca47f38998291aad7a3137e901c1d7e851" Mar 20 07:51:45 crc kubenswrapper[4749]: E0320 07:51:45.179391 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:51:47 crc kubenswrapper[4749]: I0320 07:51:47.177749 4749 scope.go:117] "RemoveContainer" containerID="0ff9be3875797ada3dfffbc86acf0005e90268274e3d10bb1025a8c4c1ddfc14" Mar 20 07:51:47 crc kubenswrapper[4749]: E0320 07:51:47.178210 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 07:51:51 crc kubenswrapper[4749]: I0320 07:51:51.177858 4749 scope.go:117] "RemoveContainer" containerID="6623fc7f986ec52db4e939e8a32c1ab92b205fb441e21157a62a09d0ff2ab3ec" Mar 20 07:51:51 crc kubenswrapper[4749]: E0320 07:51:51.178759 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:51:58 crc kubenswrapper[4749]: I0320 07:51:58.177158 4749 scope.go:117] "RemoveContainer" containerID="364d6fbf9694bce4c59b5da272a600ca47f38998291aad7a3137e901c1d7e851" Mar 20 07:51:58 crc kubenswrapper[4749]: E0320 07:51:58.179411 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:52:00 crc kubenswrapper[4749]: I0320 07:52:00.155647 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566552-kw72f"] Mar 20 07:52:00 
crc kubenswrapper[4749]: E0320 07:52:00.156184 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1efe4169-044f-4556-906e-8d193119ac73" containerName="extract-utilities" Mar 20 07:52:00 crc kubenswrapper[4749]: I0320 07:52:00.156199 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1efe4169-044f-4556-906e-8d193119ac73" containerName="extract-utilities" Mar 20 07:52:00 crc kubenswrapper[4749]: E0320 07:52:00.156217 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1efe4169-044f-4556-906e-8d193119ac73" containerName="registry-server" Mar 20 07:52:00 crc kubenswrapper[4749]: I0320 07:52:00.156225 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1efe4169-044f-4556-906e-8d193119ac73" containerName="registry-server" Mar 20 07:52:00 crc kubenswrapper[4749]: E0320 07:52:00.156262 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1efe4169-044f-4556-906e-8d193119ac73" containerName="extract-content" Mar 20 07:52:00 crc kubenswrapper[4749]: I0320 07:52:00.156271 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1efe4169-044f-4556-906e-8d193119ac73" containerName="extract-content" Mar 20 07:52:00 crc kubenswrapper[4749]: I0320 07:52:00.156495 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="1efe4169-044f-4556-906e-8d193119ac73" containerName="registry-server" Mar 20 07:52:00 crc kubenswrapper[4749]: I0320 07:52:00.157061 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566552-kw72f" Mar 20 07:52:00 crc kubenswrapper[4749]: I0320 07:52:00.161816 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 07:52:00 crc kubenswrapper[4749]: I0320 07:52:00.162072 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6rhdf" Mar 20 07:52:00 crc kubenswrapper[4749]: I0320 07:52:00.162240 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 07:52:00 crc kubenswrapper[4749]: I0320 07:52:00.174450 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566552-kw72f"] Mar 20 07:52:00 crc kubenswrapper[4749]: I0320 07:52:00.276495 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbk5b\" (UniqueName: \"kubernetes.io/projected/1f707152-0ff3-41e6-8b25-1c20197d6867-kube-api-access-sbk5b\") pod \"auto-csr-approver-29566552-kw72f\" (UID: \"1f707152-0ff3-41e6-8b25-1c20197d6867\") " pod="openshift-infra/auto-csr-approver-29566552-kw72f" Mar 20 07:52:00 crc kubenswrapper[4749]: I0320 07:52:00.379096 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbk5b\" (UniqueName: \"kubernetes.io/projected/1f707152-0ff3-41e6-8b25-1c20197d6867-kube-api-access-sbk5b\") pod \"auto-csr-approver-29566552-kw72f\" (UID: \"1f707152-0ff3-41e6-8b25-1c20197d6867\") " pod="openshift-infra/auto-csr-approver-29566552-kw72f" Mar 20 07:52:00 crc kubenswrapper[4749]: I0320 07:52:00.402907 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbk5b\" (UniqueName: \"kubernetes.io/projected/1f707152-0ff3-41e6-8b25-1c20197d6867-kube-api-access-sbk5b\") pod \"auto-csr-approver-29566552-kw72f\" (UID: \"1f707152-0ff3-41e6-8b25-1c20197d6867\") " pod="openshift-infra/auto-csr-approver-29566552-kw72f" Mar 20 
07:52:00 crc kubenswrapper[4749]: I0320 07:52:00.490207 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566552-kw72f" Mar 20 07:52:00 crc kubenswrapper[4749]: I0320 07:52:00.975863 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566552-kw72f"] Mar 20 07:52:00 crc kubenswrapper[4749]: W0320 07:52:00.981004 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f707152_0ff3_41e6_8b25_1c20197d6867.slice/crio-b3efc3d152adc5e65805233c17c989cbec316c4e3be3d4c87aea8a958cad7886 WatchSource:0}: Error finding container b3efc3d152adc5e65805233c17c989cbec316c4e3be3d4c87aea8a958cad7886: Status 404 returned error can't find the container with id b3efc3d152adc5e65805233c17c989cbec316c4e3be3d4c87aea8a958cad7886 Mar 20 07:52:01 crc kubenswrapper[4749]: I0320 07:52:01.360117 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566552-kw72f" event={"ID":"1f707152-0ff3-41e6-8b25-1c20197d6867","Type":"ContainerStarted","Data":"b3efc3d152adc5e65805233c17c989cbec316c4e3be3d4c87aea8a958cad7886"} Mar 20 07:52:02 crc kubenswrapper[4749]: I0320 07:52:02.179687 4749 scope.go:117] "RemoveContainer" containerID="0ff9be3875797ada3dfffbc86acf0005e90268274e3d10bb1025a8c4c1ddfc14" Mar 20 07:52:02 crc kubenswrapper[4749]: I0320 07:52:02.180384 4749 scope.go:117] "RemoveContainer" containerID="6623fc7f986ec52db4e939e8a32c1ab92b205fb441e21157a62a09d0ff2ab3ec" Mar 20 07:52:02 crc kubenswrapper[4749]: E0320 07:52:02.180844 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:52:02 crc kubenswrapper[4749]: E0320 07:52:02.180915 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 07:52:02 crc kubenswrapper[4749]: I0320 07:52:02.370546 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566552-kw72f" event={"ID":"1f707152-0ff3-41e6-8b25-1c20197d6867","Type":"ContainerStarted","Data":"5611713c0870a87be8edd0f11e4c9897f4d9e4b66eed4c5ddb8bfc3c5005575b"} Mar 20 07:52:02 crc kubenswrapper[4749]: I0320 07:52:02.390787 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566552-kw72f" podStartSLOduration=1.474942988 podStartE2EDuration="2.390765884s" podCreationTimestamp="2026-03-20 07:52:00 +0000 UTC" firstStartedPulling="2026-03-20 07:52:00.984755171 +0000 UTC m=+2357.534412858" lastFinishedPulling="2026-03-20 07:52:01.900578097 +0000 UTC m=+2358.450235754" observedRunningTime="2026-03-20 07:52:02.383445456 +0000 UTC m=+2358.933103113" watchObservedRunningTime="2026-03-20 07:52:02.390765884 +0000 UTC m=+2358.940423541" Mar 20 07:52:03 crc kubenswrapper[4749]: I0320 07:52:03.383103 4749 generic.go:334] "Generic (PLEG): 
container finished" podID="1f707152-0ff3-41e6-8b25-1c20197d6867" containerID="5611713c0870a87be8edd0f11e4c9897f4d9e4b66eed4c5ddb8bfc3c5005575b" exitCode=0 Mar 20 07:52:03 crc kubenswrapper[4749]: I0320 07:52:03.383184 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566552-kw72f" event={"ID":"1f707152-0ff3-41e6-8b25-1c20197d6867","Type":"ContainerDied","Data":"5611713c0870a87be8edd0f11e4c9897f4d9e4b66eed4c5ddb8bfc3c5005575b"} Mar 20 07:52:04 crc kubenswrapper[4749]: I0320 07:52:04.800056 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566552-kw72f" Mar 20 07:52:04 crc kubenswrapper[4749]: I0320 07:52:04.856085 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbk5b\" (UniqueName: \"kubernetes.io/projected/1f707152-0ff3-41e6-8b25-1c20197d6867-kube-api-access-sbk5b\") pod \"1f707152-0ff3-41e6-8b25-1c20197d6867\" (UID: \"1f707152-0ff3-41e6-8b25-1c20197d6867\") " Mar 20 07:52:04 crc kubenswrapper[4749]: I0320 07:52:04.865667 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f707152-0ff3-41e6-8b25-1c20197d6867-kube-api-access-sbk5b" (OuterVolumeSpecName: "kube-api-access-sbk5b") pod "1f707152-0ff3-41e6-8b25-1c20197d6867" (UID: "1f707152-0ff3-41e6-8b25-1c20197d6867"). InnerVolumeSpecName "kube-api-access-sbk5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:52:04 crc kubenswrapper[4749]: I0320 07:52:04.958053 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbk5b\" (UniqueName: \"kubernetes.io/projected/1f707152-0ff3-41e6-8b25-1c20197d6867-kube-api-access-sbk5b\") on node \"crc\" DevicePath \"\"" Mar 20 07:52:05 crc kubenswrapper[4749]: I0320 07:52:05.403546 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566552-kw72f" event={"ID":"1f707152-0ff3-41e6-8b25-1c20197d6867","Type":"ContainerDied","Data":"b3efc3d152adc5e65805233c17c989cbec316c4e3be3d4c87aea8a958cad7886"} Mar 20 07:52:05 crc kubenswrapper[4749]: I0320 07:52:05.404171 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3efc3d152adc5e65805233c17c989cbec316c4e3be3d4c87aea8a958cad7886" Mar 20 07:52:05 crc kubenswrapper[4749]: I0320 07:52:05.403660 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566552-kw72f" Mar 20 07:52:05 crc kubenswrapper[4749]: I0320 07:52:05.475431 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566546-wqdzs"] Mar 20 07:52:05 crc kubenswrapper[4749]: I0320 07:52:05.491277 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566546-wqdzs"] Mar 20 07:52:06 crc kubenswrapper[4749]: I0320 07:52:06.194376 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bef987c-8038-436d-a79e-c2346c61050b" path="/var/lib/kubelet/pods/2bef987c-8038-436d-a79e-c2346c61050b/volumes" Mar 20 07:52:13 crc kubenswrapper[4749]: I0320 07:52:13.178678 4749 scope.go:117] "RemoveContainer" containerID="364d6fbf9694bce4c59b5da272a600ca47f38998291aad7a3137e901c1d7e851" Mar 20 07:52:13 crc kubenswrapper[4749]: E0320 07:52:13.179508 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:52:14 crc kubenswrapper[4749]: I0320 07:52:14.188616 4749 scope.go:117] "RemoveContainer" containerID="6623fc7f986ec52db4e939e8a32c1ab92b205fb441e21157a62a09d0ff2ab3ec" Mar 20 07:52:14 crc kubenswrapper[4749]: E0320 07:52:14.191333 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:52:17 crc kubenswrapper[4749]: I0320 07:52:17.178316 4749 scope.go:117] "RemoveContainer" containerID="0ff9be3875797ada3dfffbc86acf0005e90268274e3d10bb1025a8c4c1ddfc14" Mar 20 07:52:17 crc kubenswrapper[4749]: E0320 07:52:17.178938 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 07:52:25 crc kubenswrapper[4749]: I0320 07:52:25.177987 4749 scope.go:117] "RemoveContainer" containerID="364d6fbf9694bce4c59b5da272a600ca47f38998291aad7a3137e901c1d7e851" Mar 20 07:52:25 crc kubenswrapper[4749]: E0320 07:52:25.178962 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:52:28 crc kubenswrapper[4749]: I0320 07:52:28.178109 4749 scope.go:117] "RemoveContainer" containerID="6623fc7f986ec52db4e939e8a32c1ab92b205fb441e21157a62a09d0ff2ab3ec" Mar 20 07:52:28 crc kubenswrapper[4749]: E0320 07:52:28.178855 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq 
pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:52:32 crc kubenswrapper[4749]: I0320 07:52:32.177734 4749 scope.go:117] "RemoveContainer" containerID="0ff9be3875797ada3dfffbc86acf0005e90268274e3d10bb1025a8c4c1ddfc14" Mar 20 07:52:32 crc kubenswrapper[4749]: E0320 07:52:32.178593 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 07:52:37 crc kubenswrapper[4749]: I0320 07:52:37.177679 4749 scope.go:117] "RemoveContainer" containerID="364d6fbf9694bce4c59b5da272a600ca47f38998291aad7a3137e901c1d7e851" Mar 20 07:52:37 crc kubenswrapper[4749]: E0320 07:52:37.178385 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:52:43 crc kubenswrapper[4749]: I0320 07:52:43.178078 4749 scope.go:117] "RemoveContainer" containerID="6623fc7f986ec52db4e939e8a32c1ab92b205fb441e21157a62a09d0ff2ab3ec" Mar 20 07:52:43 crc kubenswrapper[4749]: E0320 07:52:43.179174 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:52:46 crc kubenswrapper[4749]: I0320 07:52:46.177423 4749 scope.go:117] "RemoveContainer" containerID="0ff9be3875797ada3dfffbc86acf0005e90268274e3d10bb1025a8c4c1ddfc14" Mar 20 07:52:46 crc kubenswrapper[4749]: E0320 07:52:46.178361 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 07:52:52 crc kubenswrapper[4749]: I0320 07:52:52.177568 4749 scope.go:117] "RemoveContainer" containerID="364d6fbf9694bce4c59b5da272a600ca47f38998291aad7a3137e901c1d7e851" Mar 20 07:52:52 crc kubenswrapper[4749]: E0320 07:52:52.178425 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:52:54 crc kubenswrapper[4749]: I0320 07:52:54.182827 4749 scope.go:117] "RemoveContainer" containerID="6623fc7f986ec52db4e939e8a32c1ab92b205fb441e21157a62a09d0ff2ab3ec" Mar 20 07:52:54 crc kubenswrapper[4749]: E0320 07:52:54.183439 4749 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:52:56 crc kubenswrapper[4749]: I0320 07:52:56.089417 4749 scope.go:117] "RemoveContainer" containerID="eebcd41ddfad961cfc1f5283ed6419611c3a48008724ae585e93adf2005c59d6" Mar 20 07:53:00 crc kubenswrapper[4749]: I0320 07:53:00.178467 4749 scope.go:117] "RemoveContainer" containerID="0ff9be3875797ada3dfffbc86acf0005e90268274e3d10bb1025a8c4c1ddfc14" Mar 20 07:53:00 crc kubenswrapper[4749]: E0320 07:53:00.179082 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 07:53:03 crc kubenswrapper[4749]: I0320 07:53:03.178559 4749 scope.go:117] "RemoveContainer" containerID="364d6fbf9694bce4c59b5da272a600ca47f38998291aad7a3137e901c1d7e851" Mar 20 07:53:03 crc kubenswrapper[4749]: E0320 07:53:03.179271 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:53:05 crc kubenswrapper[4749]: I0320 07:53:05.178332 4749 scope.go:117] "RemoveContainer" containerID="6623fc7f986ec52db4e939e8a32c1ab92b205fb441e21157a62a09d0ff2ab3ec" Mar 20 07:53:05 crc kubenswrapper[4749]: E0320 07:53:05.180384 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:53:12 crc kubenswrapper[4749]: I0320 07:53:12.178196 4749 scope.go:117] "RemoveContainer" containerID="0ff9be3875797ada3dfffbc86acf0005e90268274e3d10bb1025a8c4c1ddfc14" Mar 20 07:53:12 crc kubenswrapper[4749]: E0320 07:53:12.179522 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 07:53:16 crc kubenswrapper[4749]: I0320 07:53:16.178648 4749 scope.go:117] "RemoveContainer" containerID="364d6fbf9694bce4c59b5da272a600ca47f38998291aad7a3137e901c1d7e851" Mar 20 07:53:16 crc kubenswrapper[4749]: E0320 07:53:16.179656 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" 
podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:53:20 crc kubenswrapper[4749]: I0320 07:53:20.178085 4749 scope.go:117] "RemoveContainer" containerID="6623fc7f986ec52db4e939e8a32c1ab92b205fb441e21157a62a09d0ff2ab3ec" Mar 20 07:53:20 crc kubenswrapper[4749]: E0320 07:53:20.179175 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:53:25 crc kubenswrapper[4749]: I0320 07:53:25.177857 4749 scope.go:117] "RemoveContainer" containerID="0ff9be3875797ada3dfffbc86acf0005e90268274e3d10bb1025a8c4c1ddfc14" Mar 20 07:53:25 crc kubenswrapper[4749]: E0320 07:53:25.179021 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 07:53:29 crc kubenswrapper[4749]: I0320 07:53:29.178116 4749 scope.go:117] "RemoveContainer" containerID="364d6fbf9694bce4c59b5da272a600ca47f38998291aad7a3137e901c1d7e851" Mar 20 07:53:29 crc kubenswrapper[4749]: E0320 07:53:29.178859 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:53:32 crc kubenswrapper[4749]: I0320 07:53:32.178147 4749 scope.go:117] "RemoveContainer" containerID="6623fc7f986ec52db4e939e8a32c1ab92b205fb441e21157a62a09d0ff2ab3ec" Mar 20 07:53:32 crc kubenswrapper[4749]: E0320 07:53:32.178652 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:53:36 crc kubenswrapper[4749]: I0320 07:53:36.178387 4749 scope.go:117] "RemoveContainer" containerID="0ff9be3875797ada3dfffbc86acf0005e90268274e3d10bb1025a8c4c1ddfc14" Mar 20 07:53:36 crc kubenswrapper[4749]: E0320 07:53:36.179241 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 07:53:42 crc kubenswrapper[4749]: I0320 07:53:42.177198 4749 scope.go:117] "RemoveContainer" containerID="364d6fbf9694bce4c59b5da272a600ca47f38998291aad7a3137e901c1d7e851" Mar 20 07:53:42 crc kubenswrapper[4749]: E0320 07:53:42.178083 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:53:46 crc kubenswrapper[4749]: I0320 07:53:46.177486 4749 scope.go:117] "RemoveContainer" containerID="6623fc7f986ec52db4e939e8a32c1ab92b205fb441e21157a62a09d0ff2ab3ec" Mar 20 07:53:46 crc kubenswrapper[4749]: I0320 07:53:46.959387 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8b9b402f-2d95-48f5-98d8-497d90956ba2","Type":"ContainerStarted","Data":"fe8a1ac1ed75509497009f93c418eb1f5af14b91d6c6edb5719e30930d18e79e"} Mar 20 07:53:46 crc kubenswrapper[4749]: I0320 07:53:46.959629 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:53:48 crc kubenswrapper[4749]: I0320 07:53:48.179367 4749 scope.go:117] "RemoveContainer" containerID="0ff9be3875797ada3dfffbc86acf0005e90268274e3d10bb1025a8c4c1ddfc14" Mar 20 07:53:48 crc kubenswrapper[4749]: E0320 07:53:48.179883 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 07:53:51 crc kubenswrapper[4749]: I0320 07:53:51.035530 4749 generic.go:334] "Generic (PLEG): container finished" podID="8b9b402f-2d95-48f5-98d8-497d90956ba2" containerID="fe8a1ac1ed75509497009f93c418eb1f5af14b91d6c6edb5719e30930d18e79e" exitCode=0 Mar 20 07:53:51 crc kubenswrapper[4749]: I0320 07:53:51.035903 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8b9b402f-2d95-48f5-98d8-497d90956ba2","Type":"ContainerDied","Data":"fe8a1ac1ed75509497009f93c418eb1f5af14b91d6c6edb5719e30930d18e79e"} Mar 20 07:53:51 crc kubenswrapper[4749]: I0320 07:53:51.035970 4749 scope.go:117] "RemoveContainer" containerID="6623fc7f986ec52db4e939e8a32c1ab92b205fb441e21157a62a09d0ff2ab3ec" Mar 20 07:53:51 crc kubenswrapper[4749]: I0320 07:53:51.036842 4749 scope.go:117] "RemoveContainer" containerID="fe8a1ac1ed75509497009f93c418eb1f5af14b91d6c6edb5719e30930d18e79e" Mar 20 07:53:51 crc kubenswrapper[4749]: E0320 07:53:51.037474 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:53:57 crc kubenswrapper[4749]: I0320 07:53:57.177130 4749 scope.go:117] "RemoveContainer" containerID="364d6fbf9694bce4c59b5da272a600ca47f38998291aad7a3137e901c1d7e851" Mar 20 07:53:58 crc kubenswrapper[4749]: I0320 07:53:58.109764 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8db06e36-0b00-4157-9345-69449da3e85f","Type":"ContainerStarted","Data":"2c6453e71c1228ecec01603c838419a1b62f8a0f0b9e2708edf3f3871dbb17aa"} Mar 20 07:53:58 crc kubenswrapper[4749]: I0320 07:53:58.110820 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 20 07:54:00 crc kubenswrapper[4749]: I0320 07:54:00.171018 4749 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566554-7qzth"] Mar 20 07:54:00 crc kubenswrapper[4749]: E0320 07:54:00.171548 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f707152-0ff3-41e6-8b25-1c20197d6867" containerName="oc" Mar 20 07:54:00 crc kubenswrapper[4749]: I0320 07:54:00.171560 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f707152-0ff3-41e6-8b25-1c20197d6867" containerName="oc" Mar 20 07:54:00 crc kubenswrapper[4749]: I0320 07:54:00.171732 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f707152-0ff3-41e6-8b25-1c20197d6867" containerName="oc" Mar 20 07:54:00 crc kubenswrapper[4749]: I0320 07:54:00.172245 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566554-7qzth" Mar 20 07:54:00 crc kubenswrapper[4749]: I0320 07:54:00.174772 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6rhdf" Mar 20 07:54:00 crc kubenswrapper[4749]: I0320 07:54:00.176004 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 07:54:00 crc kubenswrapper[4749]: I0320 07:54:00.176206 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 07:54:00 crc kubenswrapper[4749]: I0320 07:54:00.180719 4749 scope.go:117] "RemoveContainer" containerID="0ff9be3875797ada3dfffbc86acf0005e90268274e3d10bb1025a8c4c1ddfc14" Mar 20 07:54:00 crc kubenswrapper[4749]: E0320 07:54:00.181015 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 07:54:00 crc kubenswrapper[4749]: I0320 07:54:00.208772 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566554-7qzth"] Mar 20 07:54:00 crc kubenswrapper[4749]: I0320 07:54:00.303973 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvffm\" (UniqueName: \"kubernetes.io/projected/463568fe-3529-4730-a2e6-6317aef7e2a7-kube-api-access-lvffm\") pod \"auto-csr-approver-29566554-7qzth\" (UID: \"463568fe-3529-4730-a2e6-6317aef7e2a7\") " pod="openshift-infra/auto-csr-approver-29566554-7qzth" Mar 20 07:54:00 crc kubenswrapper[4749]: I0320 07:54:00.405562 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvffm\" (UniqueName: \"kubernetes.io/projected/463568fe-3529-4730-a2e6-6317aef7e2a7-kube-api-access-lvffm\") pod \"auto-csr-approver-29566554-7qzth\" (UID: \"463568fe-3529-4730-a2e6-6317aef7e2a7\") " pod="openshift-infra/auto-csr-approver-29566554-7qzth" Mar 20 07:54:00 crc kubenswrapper[4749]: I0320 07:54:00.429951 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvffm\" (UniqueName: \"kubernetes.io/projected/463568fe-3529-4730-a2e6-6317aef7e2a7-kube-api-access-lvffm\") pod \"auto-csr-approver-29566554-7qzth\" (UID: \"463568fe-3529-4730-a2e6-6317aef7e2a7\") " pod="openshift-infra/auto-csr-approver-29566554-7qzth" Mar 20 07:54:00 crc kubenswrapper[4749]: I0320 
07:54:00.507626 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566554-7qzth" Mar 20 07:54:00 crc kubenswrapper[4749]: I0320 07:54:00.911648 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566554-7qzth"] Mar 20 07:54:01 crc kubenswrapper[4749]: I0320 07:54:01.150437 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566554-7qzth" event={"ID":"463568fe-3529-4730-a2e6-6317aef7e2a7","Type":"ContainerStarted","Data":"7dac789c65508105af6a20822f4df371daba3785f9ff72a0ea8b8cd8d3a526bf"} Mar 20 07:54:02 crc kubenswrapper[4749]: I0320 07:54:02.159490 4749 generic.go:334] "Generic (PLEG): container finished" podID="8db06e36-0b00-4157-9345-69449da3e85f" containerID="2c6453e71c1228ecec01603c838419a1b62f8a0f0b9e2708edf3f3871dbb17aa" exitCode=0 Mar 20 07:54:02 crc kubenswrapper[4749]: I0320 07:54:02.159572 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8db06e36-0b00-4157-9345-69449da3e85f","Type":"ContainerDied","Data":"2c6453e71c1228ecec01603c838419a1b62f8a0f0b9e2708edf3f3871dbb17aa"} Mar 20 07:54:02 crc kubenswrapper[4749]: I0320 07:54:02.159940 4749 scope.go:117] "RemoveContainer" containerID="364d6fbf9694bce4c59b5da272a600ca47f38998291aad7a3137e901c1d7e851" Mar 20 07:54:02 crc kubenswrapper[4749]: I0320 07:54:02.160664 4749 scope.go:117] "RemoveContainer" containerID="2c6453e71c1228ecec01603c838419a1b62f8a0f0b9e2708edf3f3871dbb17aa" Mar 20 07:54:02 crc kubenswrapper[4749]: E0320 07:54:02.161008 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:54:02 crc kubenswrapper[4749]: I0320 07:54:02.162116 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566554-7qzth" event={"ID":"463568fe-3529-4730-a2e6-6317aef7e2a7","Type":"ContainerStarted","Data":"55797b51668cbef0733402e66e5942cf6e000dd7dad4463f624bd912f5188d36"} Mar 20 07:54:02 crc kubenswrapper[4749]: I0320 07:54:02.178340 4749 scope.go:117] "RemoveContainer" containerID="fe8a1ac1ed75509497009f93c418eb1f5af14b91d6c6edb5719e30930d18e79e" Mar 20 07:54:02 crc kubenswrapper[4749]: E0320 07:54:02.178637 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:54:02 crc kubenswrapper[4749]: I0320 07:54:02.248677 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566554-7qzth" podStartSLOduration=1.355495628 podStartE2EDuration="2.248655824s" podCreationTimestamp="2026-03-20 07:54:00 +0000 UTC" firstStartedPulling="2026-03-20 07:54:00.92290792 +0000 UTC m=+2477.472565567" lastFinishedPulling="2026-03-20 07:54:01.816068096 +0000 UTC m=+2478.365725763" observedRunningTime="2026-03-20 07:54:02.23741182 +0000 UTC m=+2478.787069477" watchObservedRunningTime="2026-03-20 07:54:02.248655824 +0000 UTC m=+2478.798313481" Mar 20 07:54:03 crc kubenswrapper[4749]: 
I0320 07:54:03.182693 4749 generic.go:334] "Generic (PLEG): container finished" podID="463568fe-3529-4730-a2e6-6317aef7e2a7" containerID="55797b51668cbef0733402e66e5942cf6e000dd7dad4463f624bd912f5188d36" exitCode=0 Mar 20 07:54:03 crc kubenswrapper[4749]: I0320 07:54:03.182839 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566554-7qzth" event={"ID":"463568fe-3529-4730-a2e6-6317aef7e2a7","Type":"ContainerDied","Data":"55797b51668cbef0733402e66e5942cf6e000dd7dad4463f624bd912f5188d36"} Mar 20 07:54:04 crc kubenswrapper[4749]: I0320 07:54:04.546615 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566554-7qzth" Mar 20 07:54:04 crc kubenswrapper[4749]: I0320 07:54:04.578236 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvffm\" (UniqueName: \"kubernetes.io/projected/463568fe-3529-4730-a2e6-6317aef7e2a7-kube-api-access-lvffm\") pod \"463568fe-3529-4730-a2e6-6317aef7e2a7\" (UID: \"463568fe-3529-4730-a2e6-6317aef7e2a7\") " Mar 20 07:54:04 crc kubenswrapper[4749]: I0320 07:54:04.585123 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/463568fe-3529-4730-a2e6-6317aef7e2a7-kube-api-access-lvffm" (OuterVolumeSpecName: "kube-api-access-lvffm") pod "463568fe-3529-4730-a2e6-6317aef7e2a7" (UID: "463568fe-3529-4730-a2e6-6317aef7e2a7"). InnerVolumeSpecName "kube-api-access-lvffm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:54:04 crc kubenswrapper[4749]: I0320 07:54:04.679879 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvffm\" (UniqueName: \"kubernetes.io/projected/463568fe-3529-4730-a2e6-6317aef7e2a7-kube-api-access-lvffm\") on node \"crc\" DevicePath \"\"" Mar 20 07:54:05 crc kubenswrapper[4749]: I0320 07:54:05.228844 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566554-7qzth" event={"ID":"463568fe-3529-4730-a2e6-6317aef7e2a7","Type":"ContainerDied","Data":"7dac789c65508105af6a20822f4df371daba3785f9ff72a0ea8b8cd8d3a526bf"} Mar 20 07:54:05 crc kubenswrapper[4749]: I0320 07:54:05.228903 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7dac789c65508105af6a20822f4df371daba3785f9ff72a0ea8b8cd8d3a526bf" Mar 20 07:54:05 crc kubenswrapper[4749]: I0320 07:54:05.229086 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566554-7qzth" Mar 20 07:54:05 crc kubenswrapper[4749]: I0320 07:54:05.629054 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566548-6p4xc"] Mar 20 07:54:05 crc kubenswrapper[4749]: I0320 07:54:05.635322 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566548-6p4xc"] Mar 20 07:54:06 crc kubenswrapper[4749]: I0320 07:54:06.186950 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b522c08e-c76b-4793-9947-5a5b53b5d5ba" path="/var/lib/kubelet/pods/b522c08e-c76b-4793-9947-5a5b53b5d5ba/volumes" Mar 20 07:54:13 crc kubenswrapper[4749]: I0320 07:54:13.177435 4749 scope.go:117] "RemoveContainer" containerID="fe8a1ac1ed75509497009f93c418eb1f5af14b91d6c6edb5719e30930d18e79e" Mar 20 07:54:13 crc kubenswrapper[4749]: I0320 07:54:13.177945 4749 scope.go:117] "RemoveContainer" containerID="0ff9be3875797ada3dfffbc86acf0005e90268274e3d10bb1025a8c4c1ddfc14" Mar 20 07:54:13 crc kubenswrapper[4749]: E0320 07:54:13.178266 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:54:13 crc kubenswrapper[4749]: E0320 07:54:13.178559 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 07:54:16 crc kubenswrapper[4749]: I0320 07:54:16.177688 4749 scope.go:117] "RemoveContainer" containerID="2c6453e71c1228ecec01603c838419a1b62f8a0f0b9e2708edf3f3871dbb17aa" Mar 20 07:54:16 crc kubenswrapper[4749]: E0320 07:54:16.178602 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:54:27 crc kubenswrapper[4749]: I0320 07:54:27.177806 4749 scope.go:117] "RemoveContainer" containerID="0ff9be3875797ada3dfffbc86acf0005e90268274e3d10bb1025a8c4c1ddfc14" Mar 20 07:54:27 crc kubenswrapper[4749]: I0320 07:54:27.178471 4749 scope.go:117] "RemoveContainer" containerID="fe8a1ac1ed75509497009f93c418eb1f5af14b91d6c6edb5719e30930d18e79e" Mar 20 07:54:27 crc kubenswrapper[4749]: I0320 07:54:27.178531 4749 scope.go:117] "RemoveContainer" containerID="2c6453e71c1228ecec01603c838419a1b62f8a0f0b9e2708edf3f3871dbb17aa" Mar 20 07:54:27 crc kubenswrapper[4749]: E0320 07:54:27.178667 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 
07:54:27 crc kubenswrapper[4749]: E0320 07:54:27.178694 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:54:27 crc kubenswrapper[4749]: E0320 07:54:27.178911 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:54:41 crc kubenswrapper[4749]: I0320 07:54:41.177560 4749 scope.go:117] "RemoveContainer" containerID="2c6453e71c1228ecec01603c838419a1b62f8a0f0b9e2708edf3f3871dbb17aa" Mar 20 07:54:41 crc kubenswrapper[4749]: I0320 07:54:41.180014 4749 scope.go:117] "RemoveContainer" containerID="fe8a1ac1ed75509497009f93c418eb1f5af14b91d6c6edb5719e30930d18e79e" Mar 20 07:54:41 crc kubenswrapper[4749]: E0320 07:54:41.180439 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:54:41 crc kubenswrapper[4749]: E0320 07:54:41.180619 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:54:42 crc kubenswrapper[4749]: I0320 07:54:42.179020 4749 scope.go:117] "RemoveContainer" containerID="0ff9be3875797ada3dfffbc86acf0005e90268274e3d10bb1025a8c4c1ddfc14" Mar 20 07:54:42 crc kubenswrapper[4749]: E0320 07:54:42.179601 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 07:54:54 crc kubenswrapper[4749]: I0320 07:54:54.188595 4749 scope.go:117] "RemoveContainer" containerID="2c6453e71c1228ecec01603c838419a1b62f8a0f0b9e2708edf3f3871dbb17aa" Mar 20 07:54:54 crc kubenswrapper[4749]: E0320 07:54:54.189802 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:54:55 crc kubenswrapper[4749]: I0320 07:54:55.177322 4749 scope.go:117] "RemoveContainer" containerID="0ff9be3875797ada3dfffbc86acf0005e90268274e3d10bb1025a8c4c1ddfc14" Mar 20 07:54:55 crc kubenswrapper[4749]: I0320 07:54:55.177645 4749 scope.go:117] "RemoveContainer" 
containerID="fe8a1ac1ed75509497009f93c418eb1f5af14b91d6c6edb5719e30930d18e79e" Mar 20 07:54:55 crc kubenswrapper[4749]: E0320 07:54:55.177755 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 07:54:55 crc kubenswrapper[4749]: E0320 07:54:55.177870 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:54:56 crc kubenswrapper[4749]: I0320 07:54:56.209185 4749 scope.go:117] "RemoveContainer" containerID="d74c60bc449ab7677872ff0176882e4d21d6538d1fc6893167e08391845666ab" Mar 20 07:55:08 crc kubenswrapper[4749]: I0320 07:55:08.177964 4749 scope.go:117] "RemoveContainer" containerID="2c6453e71c1228ecec01603c838419a1b62f8a0f0b9e2708edf3f3871dbb17aa" Mar 20 07:55:08 crc kubenswrapper[4749]: E0320 07:55:08.178811 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:55:10 crc kubenswrapper[4749]: I0320 07:55:10.177421 4749 scope.go:117] "RemoveContainer" containerID="fe8a1ac1ed75509497009f93c418eb1f5af14b91d6c6edb5719e30930d18e79e" Mar 20 07:55:10 crc kubenswrapper[4749]: I0320 07:55:10.177674 4749 scope.go:117] "RemoveContainer" containerID="0ff9be3875797ada3dfffbc86acf0005e90268274e3d10bb1025a8c4c1ddfc14" Mar 20 07:55:10 crc kubenswrapper[4749]: E0320 07:55:10.177704 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:55:10 crc kubenswrapper[4749]: I0320 07:55:10.829739 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" event={"ID":"12151228-1cb9-4086-9a62-f4a9583f5f69","Type":"ContainerStarted","Data":"5b4585b69865061d7b06cc9fea1a7b04c408f0d820173decb8cf2fe0fabf3eda"} Mar 20 07:55:21 crc kubenswrapper[4749]: I0320 07:55:21.177426 4749 scope.go:117] "RemoveContainer" containerID="2c6453e71c1228ecec01603c838419a1b62f8a0f0b9e2708edf3f3871dbb17aa" Mar 20 07:55:21 crc kubenswrapper[4749]: E0320 07:55:21.178183 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:55:23 crc kubenswrapper[4749]: I0320 07:55:23.177181 4749 scope.go:117] "RemoveContainer" 
containerID="fe8a1ac1ed75509497009f93c418eb1f5af14b91d6c6edb5719e30930d18e79e" Mar 20 07:55:23 crc kubenswrapper[4749]: E0320 07:55:23.177571 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:55:35 crc kubenswrapper[4749]: I0320 07:55:35.177399 4749 scope.go:117] "RemoveContainer" containerID="fe8a1ac1ed75509497009f93c418eb1f5af14b91d6c6edb5719e30930d18e79e" Mar 20 07:55:35 crc kubenswrapper[4749]: I0320 07:55:35.178219 4749 scope.go:117] "RemoveContainer" containerID="2c6453e71c1228ecec01603c838419a1b62f8a0f0b9e2708edf3f3871dbb17aa" Mar 20 07:55:35 crc kubenswrapper[4749]: E0320 07:55:35.178537 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:55:35 crc kubenswrapper[4749]: E0320 07:55:35.178609 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:55:46 crc kubenswrapper[4749]: I0320 07:55:46.177015 4749 scope.go:117] "RemoveContainer" containerID="2c6453e71c1228ecec01603c838419a1b62f8a0f0b9e2708edf3f3871dbb17aa" Mar 20 07:55:46 crc kubenswrapper[4749]: E0320 07:55:46.177959 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:55:48 crc kubenswrapper[4749]: I0320 07:55:48.177767 4749 scope.go:117] "RemoveContainer" containerID="fe8a1ac1ed75509497009f93c418eb1f5af14b91d6c6edb5719e30930d18e79e" Mar 20 07:55:48 crc kubenswrapper[4749]: E0320 07:55:48.178709 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:55:57 crc kubenswrapper[4749]: I0320 07:55:57.177491 4749 scope.go:117] "RemoveContainer" containerID="2c6453e71c1228ecec01603c838419a1b62f8a0f0b9e2708edf3f3871dbb17aa" Mar 20 07:55:57 crc kubenswrapper[4749]: E0320 07:55:57.178606 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:56:00 crc kubenswrapper[4749]: I0320 07:56:00.165136 4749 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29566556-42bpd"] Mar 20 07:56:00 crc kubenswrapper[4749]: E0320 07:56:00.165936 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="463568fe-3529-4730-a2e6-6317aef7e2a7" containerName="oc" Mar 20 07:56:00 crc kubenswrapper[4749]: I0320 07:56:00.165955 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="463568fe-3529-4730-a2e6-6317aef7e2a7" containerName="oc" Mar 20 07:56:00 crc kubenswrapper[4749]: I0320 07:56:00.166216 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="463568fe-3529-4730-a2e6-6317aef7e2a7" containerName="oc" Mar 20 07:56:00 crc kubenswrapper[4749]: I0320 07:56:00.181491 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566556-42bpd" Mar 20 07:56:00 crc kubenswrapper[4749]: I0320 07:56:00.182601 4749 scope.go:117] "RemoveContainer" containerID="fe8a1ac1ed75509497009f93c418eb1f5af14b91d6c6edb5719e30930d18e79e" Mar 20 07:56:00 crc kubenswrapper[4749]: E0320 07:56:00.182881 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:56:00 crc kubenswrapper[4749]: I0320 07:56:00.184882 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6rhdf" Mar 20 07:56:00 crc kubenswrapper[4749]: I0320 07:56:00.185466 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 07:56:00 crc kubenswrapper[4749]: I0320 07:56:00.185923 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 07:56:00 crc kubenswrapper[4749]: I0320 07:56:00.194169 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566556-42bpd"] Mar 20 07:56:00 crc kubenswrapper[4749]: I0320 07:56:00.235699 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlccg\" (UniqueName: \"kubernetes.io/projected/8e17fd93-fb61-4e2f-86c5-cfafb328c79c-kube-api-access-wlccg\") pod \"auto-csr-approver-29566556-42bpd\" (UID: \"8e17fd93-fb61-4e2f-86c5-cfafb328c79c\") " pod="openshift-infra/auto-csr-approver-29566556-42bpd" Mar 20 07:56:00 crc kubenswrapper[4749]: I0320 07:56:00.337943 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlccg\" (UniqueName: \"kubernetes.io/projected/8e17fd93-fb61-4e2f-86c5-cfafb328c79c-kube-api-access-wlccg\") pod \"auto-csr-approver-29566556-42bpd\" (UID: \"8e17fd93-fb61-4e2f-86c5-cfafb328c79c\") " pod="openshift-infra/auto-csr-approver-29566556-42bpd" Mar 20 07:56:00 crc kubenswrapper[4749]: I0320 07:56:00.360916 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlccg\" (UniqueName: \"kubernetes.io/projected/8e17fd93-fb61-4e2f-86c5-cfafb328c79c-kube-api-access-wlccg\") pod \"auto-csr-approver-29566556-42bpd\" (UID: \"8e17fd93-fb61-4e2f-86c5-cfafb328c79c\") " pod="openshift-infra/auto-csr-approver-29566556-42bpd" Mar 20 07:56:00 crc kubenswrapper[4749]: I0320 07:56:00.514119 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566556-42bpd" Mar 20 07:56:00 crc kubenswrapper[4749]: I0320 07:56:00.837255 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566556-42bpd"] Mar 20 07:56:00 crc kubenswrapper[4749]: I0320 07:56:00.849326 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 07:56:01 crc kubenswrapper[4749]: I0320 07:56:01.335353 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566556-42bpd" event={"ID":"8e17fd93-fb61-4e2f-86c5-cfafb328c79c","Type":"ContainerStarted","Data":"4b47602dcdab8bb8bf064cf513fe20e812a503c94ecb0e432281efb86955fc61"} Mar 20 07:56:02 crc kubenswrapper[4749]: I0320 07:56:02.351047 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566556-42bpd" event={"ID":"8e17fd93-fb61-4e2f-86c5-cfafb328c79c","Type":"ContainerStarted","Data":"9704d59f05eae35673fe373c8ee22f03b0c4a5aaed0685139fe32f447db41b06"} Mar 20 07:56:02 crc kubenswrapper[4749]: I0320 07:56:02.373031 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566556-42bpd" podStartSLOduration=1.422694016 podStartE2EDuration="2.372993996s" podCreationTimestamp="2026-03-20 07:56:00 +0000 UTC" firstStartedPulling="2026-03-20 07:56:00.849052229 +0000 UTC m=+2597.398709946" lastFinishedPulling="2026-03-20 07:56:01.799352239 +0000 UTC m=+2598.349009926" observedRunningTime="2026-03-20 07:56:02.368456568 +0000 UTC m=+2598.918114255" watchObservedRunningTime="2026-03-20 07:56:02.372993996 +0000 UTC m=+2598.922651653" Mar 20 07:56:03 crc kubenswrapper[4749]: I0320 07:56:03.372507 4749 generic.go:334] "Generic (PLEG): container finished" podID="8e17fd93-fb61-4e2f-86c5-cfafb328c79c" containerID="9704d59f05eae35673fe373c8ee22f03b0c4a5aaed0685139fe32f447db41b06" exitCode=0 Mar 20 07:56:03 crc kubenswrapper[4749]: I0320 07:56:03.372840 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566556-42bpd" event={"ID":"8e17fd93-fb61-4e2f-86c5-cfafb328c79c","Type":"ContainerDied","Data":"9704d59f05eae35673fe373c8ee22f03b0c4a5aaed0685139fe32f447db41b06"} Mar 20 07:56:04 crc kubenswrapper[4749]: I0320 07:56:04.781552 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566556-42bpd" Mar 20 07:56:04 crc kubenswrapper[4749]: I0320 07:56:04.819624 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlccg\" (UniqueName: \"kubernetes.io/projected/8e17fd93-fb61-4e2f-86c5-cfafb328c79c-kube-api-access-wlccg\") pod \"8e17fd93-fb61-4e2f-86c5-cfafb328c79c\" (UID: \"8e17fd93-fb61-4e2f-86c5-cfafb328c79c\") " Mar 20 07:56:04 crc kubenswrapper[4749]: I0320 07:56:04.825428 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e17fd93-fb61-4e2f-86c5-cfafb328c79c-kube-api-access-wlccg" (OuterVolumeSpecName: "kube-api-access-wlccg") pod "8e17fd93-fb61-4e2f-86c5-cfafb328c79c" (UID: "8e17fd93-fb61-4e2f-86c5-cfafb328c79c"). InnerVolumeSpecName "kube-api-access-wlccg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 07:56:04 crc kubenswrapper[4749]: I0320 07:56:04.920940 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlccg\" (UniqueName: \"kubernetes.io/projected/8e17fd93-fb61-4e2f-86c5-cfafb328c79c-kube-api-access-wlccg\") on node \"crc\" DevicePath \"\"" Mar 20 07:56:05 crc kubenswrapper[4749]: I0320 07:56:05.397133 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566556-42bpd" Mar 20 07:56:05 crc kubenswrapper[4749]: I0320 07:56:05.408412 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566556-42bpd" event={"ID":"8e17fd93-fb61-4e2f-86c5-cfafb328c79c","Type":"ContainerDied","Data":"4b47602dcdab8bb8bf064cf513fe20e812a503c94ecb0e432281efb86955fc61"} Mar 20 07:56:05 crc kubenswrapper[4749]: I0320 07:56:05.408497 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b47602dcdab8bb8bf064cf513fe20e812a503c94ecb0e432281efb86955fc61" Mar 20 07:56:05 crc kubenswrapper[4749]: I0320 07:56:05.439090 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566550-54kkc"] Mar 20 07:56:05 crc kubenswrapper[4749]: I0320 07:56:05.445027 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566550-54kkc"] Mar 20 07:56:06 crc kubenswrapper[4749]: I0320 07:56:06.195429 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b925be05-c5d6-40bf-974b-2964fdc28a95" path="/var/lib/kubelet/pods/b925be05-c5d6-40bf-974b-2964fdc28a95/volumes" Mar 20 07:56:12 crc kubenswrapper[4749]: I0320 07:56:12.177186 4749 scope.go:117] "RemoveContainer" containerID="2c6453e71c1228ecec01603c838419a1b62f8a0f0b9e2708edf3f3871dbb17aa" Mar 20 07:56:12 crc kubenswrapper[4749]: E0320 07:56:12.177922 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:56:13 crc kubenswrapper[4749]: I0320 07:56:13.177445 4749 scope.go:117] "RemoveContainer" containerID="fe8a1ac1ed75509497009f93c418eb1f5af14b91d6c6edb5719e30930d18e79e" Mar 20 07:56:13 crc kubenswrapper[4749]: E0320 07:56:13.178753 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:56:24 crc kubenswrapper[4749]: I0320 07:56:24.184175 4749 scope.go:117] "RemoveContainer" containerID="fe8a1ac1ed75509497009f93c418eb1f5af14b91d6c6edb5719e30930d18e79e" Mar 20 07:56:24 crc kubenswrapper[4749]: E0320 07:56:24.184938 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:56:27 crc kubenswrapper[4749]: I0320 07:56:27.177518 4749 scope.go:117] "RemoveContainer" 
containerID="2c6453e71c1228ecec01603c838419a1b62f8a0f0b9e2708edf3f3871dbb17aa" Mar 20 07:56:27 crc kubenswrapper[4749]: E0320 07:56:27.178536 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:56:36 crc kubenswrapper[4749]: I0320 07:56:36.205848 4749 scope.go:117] "RemoveContainer" containerID="fe8a1ac1ed75509497009f93c418eb1f5af14b91d6c6edb5719e30930d18e79e" Mar 20 07:56:36 crc kubenswrapper[4749]: E0320 07:56:36.206773 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:56:38 crc kubenswrapper[4749]: I0320 07:56:38.177999 4749 scope.go:117] "RemoveContainer" containerID="2c6453e71c1228ecec01603c838419a1b62f8a0f0b9e2708edf3f3871dbb17aa" Mar 20 07:56:38 crc kubenswrapper[4749]: E0320 07:56:38.178693 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:56:49 crc kubenswrapper[4749]: I0320 07:56:49.178043 4749 scope.go:117] "RemoveContainer" containerID="2c6453e71c1228ecec01603c838419a1b62f8a0f0b9e2708edf3f3871dbb17aa" Mar 20 07:56:49 crc kubenswrapper[4749]: E0320 07:56:49.179221 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:56:51 crc kubenswrapper[4749]: I0320 07:56:51.177878 4749 scope.go:117] "RemoveContainer" containerID="fe8a1ac1ed75509497009f93c418eb1f5af14b91d6c6edb5719e30930d18e79e" Mar 20 07:56:51 crc kubenswrapper[4749]: E0320 07:56:51.178348 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:56:56 crc kubenswrapper[4749]: I0320 07:56:56.336648 4749 scope.go:117] "RemoveContainer" containerID="8ef8ce5580f70e3e0d1ed8b2cf72dc05232022a8f3e65ea8398f09e903439fb9" Mar 20 07:57:01 crc kubenswrapper[4749]: I0320 07:57:01.179907 4749 scope.go:117] "RemoveContainer" containerID="2c6453e71c1228ecec01603c838419a1b62f8a0f0b9e2708edf3f3871dbb17aa" Mar 20 07:57:01 crc kubenswrapper[4749]: E0320 07:57:01.181226 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" 
podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:57:05 crc kubenswrapper[4749]: I0320 07:57:05.177424 4749 scope.go:117] "RemoveContainer" containerID="fe8a1ac1ed75509497009f93c418eb1f5af14b91d6c6edb5719e30930d18e79e" Mar 20 07:57:05 crc kubenswrapper[4749]: E0320 07:57:05.178047 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:57:12 crc kubenswrapper[4749]: I0320 07:57:12.178509 4749 scope.go:117] "RemoveContainer" containerID="2c6453e71c1228ecec01603c838419a1b62f8a0f0b9e2708edf3f3871dbb17aa" Mar 20 07:57:12 crc kubenswrapper[4749]: E0320 07:57:12.179567 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:57:20 crc kubenswrapper[4749]: I0320 07:57:20.177377 4749 scope.go:117] "RemoveContainer" containerID="fe8a1ac1ed75509497009f93c418eb1f5af14b91d6c6edb5719e30930d18e79e" Mar 20 07:57:20 crc kubenswrapper[4749]: E0320 07:57:20.178600 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:57:24 crc kubenswrapper[4749]: I0320 07:57:24.186808 4749 scope.go:117] "RemoveContainer" containerID="2c6453e71c1228ecec01603c838419a1b62f8a0f0b9e2708edf3f3871dbb17aa" Mar 20 07:57:24 crc kubenswrapper[4749]: E0320 07:57:24.187305 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:57:31 crc kubenswrapper[4749]: I0320 07:57:31.177163 4749 scope.go:117] "RemoveContainer" containerID="fe8a1ac1ed75509497009f93c418eb1f5af14b91d6c6edb5719e30930d18e79e" Mar 20 07:57:31 crc kubenswrapper[4749]: E0320 07:57:31.178095 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:57:34 crc kubenswrapper[4749]: I0320 07:57:34.515258 4749 patch_prober.go:28] interesting pod/machine-config-daemon-fxqfd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 07:57:34 crc kubenswrapper[4749]: I0320 07:57:34.515644 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" 
podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:57:36 crc kubenswrapper[4749]: I0320 07:57:36.177824 4749 scope.go:117] "RemoveContainer" containerID="2c6453e71c1228ecec01603c838419a1b62f8a0f0b9e2708edf3f3871dbb17aa" Mar 20 07:57:36 crc kubenswrapper[4749]: E0320 07:57:36.178502 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:57:42 crc kubenswrapper[4749]: I0320 07:57:42.178466 4749 scope.go:117] "RemoveContainer" containerID="fe8a1ac1ed75509497009f93c418eb1f5af14b91d6c6edb5719e30930d18e79e" Mar 20 07:57:42 crc kubenswrapper[4749]: E0320 07:57:42.179526 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:57:47 crc kubenswrapper[4749]: I0320 07:57:47.178176 4749 scope.go:117] "RemoveContainer" containerID="2c6453e71c1228ecec01603c838419a1b62f8a0f0b9e2708edf3f3871dbb17aa" Mar 20 07:57:47 crc kubenswrapper[4749]: E0320 07:57:47.178965 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:57:54 crc kubenswrapper[4749]: I0320 07:57:54.182541 4749 scope.go:117] "RemoveContainer" containerID="fe8a1ac1ed75509497009f93c418eb1f5af14b91d6c6edb5719e30930d18e79e" Mar 20 07:57:54 crc kubenswrapper[4749]: E0320 07:57:54.183251 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:58:00 crc kubenswrapper[4749]: I0320 07:58:00.168236 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566558-j4rfz"] Mar 20 07:58:00 crc kubenswrapper[4749]: E0320 07:58:00.169671 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e17fd93-fb61-4e2f-86c5-cfafb328c79c" containerName="oc" Mar 20 07:58:00 crc kubenswrapper[4749]: I0320 07:58:00.169694 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e17fd93-fb61-4e2f-86c5-cfafb328c79c" containerName="oc" Mar 20 07:58:00 crc kubenswrapper[4749]: I0320 07:58:00.171017 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e17fd93-fb61-4e2f-86c5-cfafb328c79c" containerName="oc" Mar 20 07:58:00 crc kubenswrapper[4749]: I0320 07:58:00.171970 4749 util.go:30] "No sandbox for pod can be found. 
Mar 20 07:58:00 crc kubenswrapper[4749]: I0320 07:58:00.171970 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566558-j4rfz"
Mar 20 07:58:00 crc kubenswrapper[4749]: I0320 07:58:00.175332 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 07:58:00 crc kubenswrapper[4749]: I0320 07:58:00.175423 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6rhdf"
Mar 20 07:58:00 crc kubenswrapper[4749]: I0320 07:58:00.183666 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 07:58:00 crc kubenswrapper[4749]: I0320 07:58:00.202877 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566558-j4rfz"]
Mar 20 07:58:00 crc kubenswrapper[4749]: I0320 07:58:00.349903 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpc8g\" (UniqueName: \"kubernetes.io/projected/9d894f2e-7d4b-4e30-bfb2-9e50ed440a9e-kube-api-access-mpc8g\") pod \"auto-csr-approver-29566558-j4rfz\" (UID: \"9d894f2e-7d4b-4e30-bfb2-9e50ed440a9e\") " pod="openshift-infra/auto-csr-approver-29566558-j4rfz"
Mar 20 07:58:00 crc kubenswrapper[4749]: I0320 07:58:00.451543 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpc8g\" (UniqueName: \"kubernetes.io/projected/9d894f2e-7d4b-4e30-bfb2-9e50ed440a9e-kube-api-access-mpc8g\") pod \"auto-csr-approver-29566558-j4rfz\" (UID: \"9d894f2e-7d4b-4e30-bfb2-9e50ed440a9e\") " pod="openshift-infra/auto-csr-approver-29566558-j4rfz"
Mar 20 07:58:00 crc kubenswrapper[4749]: I0320 07:58:00.480555 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpc8g\" (UniqueName: \"kubernetes.io/projected/9d894f2e-7d4b-4e30-bfb2-9e50ed440a9e-kube-api-access-mpc8g\") pod \"auto-csr-approver-29566558-j4rfz\" (UID: \"9d894f2e-7d4b-4e30-bfb2-9e50ed440a9e\") " pod="openshift-infra/auto-csr-approver-29566558-j4rfz"
Mar 20 07:58:00 crc kubenswrapper[4749]: I0320 07:58:00.502387 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566558-j4rfz"
Mar 20 07:58:00 crc kubenswrapper[4749]: I0320 07:58:00.966403 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566558-j4rfz"]
Mar 20 07:58:01 crc kubenswrapper[4749]: I0320 07:58:01.485104 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566558-j4rfz" event={"ID":"9d894f2e-7d4b-4e30-bfb2-9e50ed440a9e","Type":"ContainerStarted","Data":"2b17d3a3cafe194a6052a2103a58d59b0fbbd2c3bd0be0c12e1ae399fe1f6776"}
Mar 20 07:58:02 crc kubenswrapper[4749]: I0320 07:58:02.178592 4749 scope.go:117] "RemoveContainer" containerID="2c6453e71c1228ecec01603c838419a1b62f8a0f0b9e2708edf3f3871dbb17aa"
Mar 20 07:58:02 crc kubenswrapper[4749]: E0320 07:58:02.178765 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f"
Mar 20 07:58:02 crc kubenswrapper[4749]: I0320 07:58:02.493007 4749 generic.go:334] "Generic (PLEG): container finished" podID="9d894f2e-7d4b-4e30-bfb2-9e50ed440a9e" containerID="99df9f31376559c47d49dd830959e482f2c285c17b804335d85dd870efc37e5c" exitCode=0
Mar 20 07:58:02 crc kubenswrapper[4749]: I0320 07:58:02.493420 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566558-j4rfz" event={"ID":"9d894f2e-7d4b-4e30-bfb2-9e50ed440a9e","Type":"ContainerDied","Data":"99df9f31376559c47d49dd830959e482f2c285c17b804335d85dd870efc37e5c"}
Mar 20 07:58:03 crc kubenswrapper[4749]: I0320 07:58:03.928604 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566558-j4rfz"
Mar 20 07:58:04 crc kubenswrapper[4749]: I0320 07:58:04.009546 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpc8g\" (UniqueName: \"kubernetes.io/projected/9d894f2e-7d4b-4e30-bfb2-9e50ed440a9e-kube-api-access-mpc8g\") pod \"9d894f2e-7d4b-4e30-bfb2-9e50ed440a9e\" (UID: \"9d894f2e-7d4b-4e30-bfb2-9e50ed440a9e\") "
Mar 20 07:58:04 crc kubenswrapper[4749]: I0320 07:58:04.014715 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d894f2e-7d4b-4e30-bfb2-9e50ed440a9e-kube-api-access-mpc8g" (OuterVolumeSpecName: "kube-api-access-mpc8g") pod "9d894f2e-7d4b-4e30-bfb2-9e50ed440a9e" (UID: "9d894f2e-7d4b-4e30-bfb2-9e50ed440a9e"). InnerVolumeSpecName "kube-api-access-mpc8g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 07:58:04 crc kubenswrapper[4749]: I0320 07:58:04.112479 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpc8g\" (UniqueName: \"kubernetes.io/projected/9d894f2e-7d4b-4e30-bfb2-9e50ed440a9e-kube-api-access-mpc8g\") on node \"crc\" DevicePath \"\""
Mar 20 07:58:04 crc kubenswrapper[4749]: I0320 07:58:04.515025 4749 patch_prober.go:28] interesting pod/machine-config-daemon-fxqfd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 07:58:04 crc kubenswrapper[4749]: I0320 07:58:04.515113 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 07:58:04 crc kubenswrapper[4749]: I0320 07:58:04.519460 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566558-j4rfz" event={"ID":"9d894f2e-7d4b-4e30-bfb2-9e50ed440a9e","Type":"ContainerDied","Data":"2b17d3a3cafe194a6052a2103a58d59b0fbbd2c3bd0be0c12e1ae399fe1f6776"}
Mar 20 07:58:04 crc kubenswrapper[4749]: I0320 07:58:04.519508 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b17d3a3cafe194a6052a2103a58d59b0fbbd2c3bd0be0c12e1ae399fe1f6776"
Mar 20 07:58:04 crc kubenswrapper[4749]: I0320 07:58:04.519596 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566558-j4rfz"
Mar 20 07:58:05 crc kubenswrapper[4749]: I0320 07:58:05.017921 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566552-kw72f"]
Mar 20 07:58:05 crc kubenswrapper[4749]: I0320 07:58:05.026142 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566552-kw72f"]
Mar 20 07:58:06 crc kubenswrapper[4749]: I0320 07:58:06.194652 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f707152-0ff3-41e6-8b25-1c20197d6867" path="/var/lib/kubelet/pods/1f707152-0ff3-41e6-8b25-1c20197d6867/volumes"
Mar 20 07:58:08 crc kubenswrapper[4749]: I0320 07:58:08.179015 4749 scope.go:117] "RemoveContainer" containerID="fe8a1ac1ed75509497009f93c418eb1f5af14b91d6c6edb5719e30930d18e79e"
Mar 20 07:58:08 crc kubenswrapper[4749]: E0320 07:58:08.179731 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2"
podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:58:21 crc kubenswrapper[4749]: I0320 07:58:21.177813 4749 scope.go:117] "RemoveContainer" containerID="fe8a1ac1ed75509497009f93c418eb1f5af14b91d6c6edb5719e30930d18e79e" Mar 20 07:58:21 crc kubenswrapper[4749]: E0320 07:58:21.179080 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:58:30 crc kubenswrapper[4749]: I0320 07:58:30.177164 4749 scope.go:117] "RemoveContainer" containerID="2c6453e71c1228ecec01603c838419a1b62f8a0f0b9e2708edf3f3871dbb17aa" Mar 20 07:58:30 crc kubenswrapper[4749]: E0320 07:58:30.177850 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:58:34 crc kubenswrapper[4749]: I0320 07:58:34.191494 4749 scope.go:117] "RemoveContainer" containerID="fe8a1ac1ed75509497009f93c418eb1f5af14b91d6c6edb5719e30930d18e79e" Mar 20 07:58:34 crc kubenswrapper[4749]: E0320 07:58:34.193067 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:58:34 crc kubenswrapper[4749]: I0320 07:58:34.515034 4749 patch_prober.go:28] interesting pod/machine-config-daemon-fxqfd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 07:58:34 crc kubenswrapper[4749]: I0320 07:58:34.515525 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 07:58:34 crc kubenswrapper[4749]: I0320 07:58:34.515613 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" Mar 20 07:58:34 crc kubenswrapper[4749]: I0320 07:58:34.516754 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5b4585b69865061d7b06cc9fea1a7b04c408f0d820173decb8cf2fe0fabf3eda"} pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 07:58:34 crc kubenswrapper[4749]: I0320 07:58:34.516871 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerName="machine-config-daemon" 
containerID="cri-o://5b4585b69865061d7b06cc9fea1a7b04c408f0d820173decb8cf2fe0fabf3eda" gracePeriod=600 Mar 20 07:58:34 crc kubenswrapper[4749]: I0320 07:58:34.801890 4749 generic.go:334] "Generic (PLEG): container finished" podID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerID="5b4585b69865061d7b06cc9fea1a7b04c408f0d820173decb8cf2fe0fabf3eda" exitCode=0 Mar 20 07:58:34 crc kubenswrapper[4749]: I0320 07:58:34.801937 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" event={"ID":"12151228-1cb9-4086-9a62-f4a9583f5f69","Type":"ContainerDied","Data":"5b4585b69865061d7b06cc9fea1a7b04c408f0d820173decb8cf2fe0fabf3eda"} Mar 20 07:58:34 crc kubenswrapper[4749]: I0320 07:58:34.801979 4749 scope.go:117] "RemoveContainer" containerID="0ff9be3875797ada3dfffbc86acf0005e90268274e3d10bb1025a8c4c1ddfc14" Mar 20 07:58:35 crc kubenswrapper[4749]: I0320 07:58:35.811445 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" event={"ID":"12151228-1cb9-4086-9a62-f4a9583f5f69","Type":"ContainerStarted","Data":"ea0896570df2e5e44ead081d77eb8ae850d278e89ed0c72cc980fc1dfdd27382"} Mar 20 07:58:43 crc kubenswrapper[4749]: I0320 07:58:43.177507 4749 scope.go:117] "RemoveContainer" containerID="2c6453e71c1228ecec01603c838419a1b62f8a0f0b9e2708edf3f3871dbb17aa" Mar 20 07:58:43 crc kubenswrapper[4749]: E0320 07:58:43.178358 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:58:49 crc kubenswrapper[4749]: I0320 07:58:49.177697 4749 scope.go:117] "RemoveContainer" containerID="fe8a1ac1ed75509497009f93c418eb1f5af14b91d6c6edb5719e30930d18e79e" Mar 20 07:58:49 crc kubenswrapper[4749]: E0320 07:58:49.178636 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:58:56 crc kubenswrapper[4749]: I0320 07:58:56.445444 4749 scope.go:117] "RemoveContainer" containerID="5611713c0870a87be8edd0f11e4c9897f4d9e4b66eed4c5ddb8bfc3c5005575b" Mar 20 07:58:57 crc kubenswrapper[4749]: I0320 07:58:57.179086 4749 scope.go:117] "RemoveContainer" containerID="2c6453e71c1228ecec01603c838419a1b62f8a0f0b9e2708edf3f3871dbb17aa" Mar 20 07:58:57 crc kubenswrapper[4749]: E0320 07:58:57.180260 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:59:01 crc kubenswrapper[4749]: I0320 07:59:01.177437 4749 scope.go:117] "RemoveContainer" containerID="fe8a1ac1ed75509497009f93c418eb1f5af14b91d6c6edb5719e30930d18e79e" Mar 20 07:59:02 crc kubenswrapper[4749]: I0320 07:59:02.073167 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"8b9b402f-2d95-48f5-98d8-497d90956ba2","Type":"ContainerStarted","Data":"cc4625b9209797aa379352fd9f58107f7e42d2b4f479b4f5c1a0f1d6334d65a3"} Mar 20 07:59:02 crc kubenswrapper[4749]: I0320 07:59:02.073893 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 20 07:59:06 crc kubenswrapper[4749]: I0320 07:59:06.114747 4749 generic.go:334] "Generic (PLEG): container finished" podID="8b9b402f-2d95-48f5-98d8-497d90956ba2" containerID="cc4625b9209797aa379352fd9f58107f7e42d2b4f479b4f5c1a0f1d6334d65a3" exitCode=0 Mar 20 07:59:06 crc kubenswrapper[4749]: I0320 07:59:06.114835 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8b9b402f-2d95-48f5-98d8-497d90956ba2","Type":"ContainerDied","Data":"cc4625b9209797aa379352fd9f58107f7e42d2b4f479b4f5c1a0f1d6334d65a3"} Mar 20 07:59:06 crc kubenswrapper[4749]: I0320 07:59:06.115330 4749 scope.go:117] "RemoveContainer" containerID="fe8a1ac1ed75509497009f93c418eb1f5af14b91d6c6edb5719e30930d18e79e" Mar 20 07:59:06 crc kubenswrapper[4749]: I0320 07:59:06.115994 4749 scope.go:117] "RemoveContainer" containerID="cc4625b9209797aa379352fd9f58107f7e42d2b4f479b4f5c1a0f1d6334d65a3" Mar 20 07:59:06 crc kubenswrapper[4749]: E0320 07:59:06.116264 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:59:08 crc kubenswrapper[4749]: I0320 07:59:08.177800 4749 scope.go:117] "RemoveContainer" containerID="2c6453e71c1228ecec01603c838419a1b62f8a0f0b9e2708edf3f3871dbb17aa" Mar 20 07:59:09 crc kubenswrapper[4749]: I0320 07:59:09.144301 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8db06e36-0b00-4157-9345-69449da3e85f","Type":"ContainerStarted","Data":"ae3deabce868852ca8b2c75b310320888cefe3d1f8956d9347d0fcd6a7d22838"} Mar 20 07:59:09 crc kubenswrapper[4749]: I0320 07:59:09.145164 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 20 07:59:12 crc kubenswrapper[4749]: I0320 07:59:12.174719 4749 generic.go:334] "Generic (PLEG): container finished" podID="8db06e36-0b00-4157-9345-69449da3e85f" containerID="ae3deabce868852ca8b2c75b310320888cefe3d1f8956d9347d0fcd6a7d22838" exitCode=0 Mar 20 07:59:12 crc kubenswrapper[4749]: I0320 07:59:12.174771 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8db06e36-0b00-4157-9345-69449da3e85f","Type":"ContainerDied","Data":"ae3deabce868852ca8b2c75b310320888cefe3d1f8956d9347d0fcd6a7d22838"} Mar 20 07:59:12 crc kubenswrapper[4749]: I0320 07:59:12.174809 4749 scope.go:117] "RemoveContainer" containerID="2c6453e71c1228ecec01603c838419a1b62f8a0f0b9e2708edf3f3871dbb17aa" Mar 20 07:59:12 crc kubenswrapper[4749]: I0320 07:59:12.175886 4749 scope.go:117] "RemoveContainer" containerID="ae3deabce868852ca8b2c75b310320888cefe3d1f8956d9347d0fcd6a7d22838" Mar 20 07:59:12 crc kubenswrapper[4749]: E0320 07:59:12.176474 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" 
pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:59:21 crc kubenswrapper[4749]: I0320 07:59:21.177972 4749 scope.go:117] "RemoveContainer" containerID="cc4625b9209797aa379352fd9f58107f7e42d2b4f479b4f5c1a0f1d6334d65a3" Mar 20 07:59:21 crc kubenswrapper[4749]: E0320 07:59:21.179841 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:59:26 crc kubenswrapper[4749]: I0320 07:59:26.177984 4749 scope.go:117] "RemoveContainer" containerID="ae3deabce868852ca8b2c75b310320888cefe3d1f8956d9347d0fcd6a7d22838" Mar 20 07:59:26 crc kubenswrapper[4749]: E0320 07:59:26.178777 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:59:33 crc kubenswrapper[4749]: I0320 07:59:33.177921 4749 scope.go:117] "RemoveContainer" containerID="cc4625b9209797aa379352fd9f58107f7e42d2b4f479b4f5c1a0f1d6334d65a3" Mar 20 07:59:33 crc kubenswrapper[4749]: E0320 07:59:33.178707 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:59:39 crc kubenswrapper[4749]: I0320 07:59:39.177248 4749 scope.go:117] "RemoveContainer" containerID="ae3deabce868852ca8b2c75b310320888cefe3d1f8956d9347d0fcd6a7d22838" Mar 20 07:59:39 crc kubenswrapper[4749]: E0320 07:59:39.178080 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:59:47 crc kubenswrapper[4749]: I0320 07:59:47.177500 4749 scope.go:117] "RemoveContainer" containerID="cc4625b9209797aa379352fd9f58107f7e42d2b4f479b4f5c1a0f1d6334d65a3" Mar 20 07:59:47 crc kubenswrapper[4749]: E0320 07:59:47.178387 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 07:59:53 crc kubenswrapper[4749]: I0320 07:59:53.179334 4749 scope.go:117] "RemoveContainer" containerID="ae3deabce868852ca8b2c75b310320888cefe3d1f8956d9347d0fcd6a7d22838" Mar 20 07:59:53 crc kubenswrapper[4749]: E0320 07:59:53.180627 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" 
podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 07:59:55 crc kubenswrapper[4749]: I0320 07:59:55.785079 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bvhdp"] Mar 20 07:59:55 crc kubenswrapper[4749]: E0320 07:59:55.788503 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d894f2e-7d4b-4e30-bfb2-9e50ed440a9e" containerName="oc" Mar 20 07:59:55 crc kubenswrapper[4749]: I0320 07:59:55.788640 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d894f2e-7d4b-4e30-bfb2-9e50ed440a9e" containerName="oc" Mar 20 07:59:55 crc kubenswrapper[4749]: I0320 07:59:55.789002 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d894f2e-7d4b-4e30-bfb2-9e50ed440a9e" containerName="oc" Mar 20 07:59:55 crc kubenswrapper[4749]: I0320 07:59:55.794644 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bvhdp" Mar 20 07:59:55 crc kubenswrapper[4749]: I0320 07:59:55.800947 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bvhdp"] Mar 20 07:59:55 crc kubenswrapper[4749]: I0320 07:59:55.904478 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cf7e126-abcf-4e2c-8049-74789387530f-utilities\") pod \"community-operators-bvhdp\" (UID: \"7cf7e126-abcf-4e2c-8049-74789387530f\") " pod="openshift-marketplace/community-operators-bvhdp" Mar 20 07:59:55 crc kubenswrapper[4749]: I0320 07:59:55.904602 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cf7e126-abcf-4e2c-8049-74789387530f-catalog-content\") pod \"community-operators-bvhdp\" (UID: \"7cf7e126-abcf-4e2c-8049-74789387530f\") " pod="openshift-marketplace/community-operators-bvhdp" Mar 20 07:59:55 crc kubenswrapper[4749]: I0320 07:59:55.904667 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkspb\" (UniqueName: \"kubernetes.io/projected/7cf7e126-abcf-4e2c-8049-74789387530f-kube-api-access-xkspb\") pod \"community-operators-bvhdp\" (UID: \"7cf7e126-abcf-4e2c-8049-74789387530f\") " pod="openshift-marketplace/community-operators-bvhdp" Mar 20 07:59:56 crc kubenswrapper[4749]: I0320 07:59:56.005751 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkspb\" (UniqueName: \"kubernetes.io/projected/7cf7e126-abcf-4e2c-8049-74789387530f-kube-api-access-xkspb\") pod \"community-operators-bvhdp\" (UID: \"7cf7e126-abcf-4e2c-8049-74789387530f\") " pod="openshift-marketplace/community-operators-bvhdp" Mar 20 07:59:56 crc kubenswrapper[4749]: I0320 07:59:56.005851 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cf7e126-abcf-4e2c-8049-74789387530f-utilities\") pod \"community-operators-bvhdp\" (UID: \"7cf7e126-abcf-4e2c-8049-74789387530f\") " pod="openshift-marketplace/community-operators-bvhdp" Mar 20 07:59:56 crc kubenswrapper[4749]: I0320 07:59:56.005905 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cf7e126-abcf-4e2c-8049-74789387530f-catalog-content\") pod \"community-operators-bvhdp\" (UID: \"7cf7e126-abcf-4e2c-8049-74789387530f\") " 
pod="openshift-marketplace/community-operators-bvhdp" Mar 20 07:59:56 crc kubenswrapper[4749]: I0320 07:59:56.006481 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cf7e126-abcf-4e2c-8049-74789387530f-catalog-content\") pod \"community-operators-bvhdp\" (UID: \"7cf7e126-abcf-4e2c-8049-74789387530f\") " pod="openshift-marketplace/community-operators-bvhdp" Mar 20 07:59:56 crc kubenswrapper[4749]: I0320 07:59:56.006591 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cf7e126-abcf-4e2c-8049-74789387530f-utilities\") pod \"community-operators-bvhdp\" (UID: \"7cf7e126-abcf-4e2c-8049-74789387530f\") " pod="openshift-marketplace/community-operators-bvhdp" Mar 20 07:59:56 crc kubenswrapper[4749]: I0320 07:59:56.042692 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkspb\" (UniqueName: \"kubernetes.io/projected/7cf7e126-abcf-4e2c-8049-74789387530f-kube-api-access-xkspb\") pod \"community-operators-bvhdp\" (UID: \"7cf7e126-abcf-4e2c-8049-74789387530f\") " pod="openshift-marketplace/community-operators-bvhdp" Mar 20 07:59:56 crc kubenswrapper[4749]: I0320 07:59:56.114658 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bvhdp" Mar 20 07:59:56 crc kubenswrapper[4749]: I0320 07:59:56.660101 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bvhdp"] Mar 20 07:59:56 crc kubenswrapper[4749]: W0320 07:59:56.672917 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cf7e126_abcf_4e2c_8049_74789387530f.slice/crio-afa2224ebb02f33ef736f141a9208380ff8fc8b0fdbb8ceee52c24ada5124420 WatchSource:0}: Error finding container afa2224ebb02f33ef736f141a9208380ff8fc8b0fdbb8ceee52c24ada5124420: Status 404 returned error can't find the container with id afa2224ebb02f33ef736f141a9208380ff8fc8b0fdbb8ceee52c24ada5124420 Mar 20 07:59:57 crc kubenswrapper[4749]: I0320 07:59:57.616440 4749 generic.go:334] "Generic (PLEG): container finished" podID="7cf7e126-abcf-4e2c-8049-74789387530f" containerID="074490293c61000178c3de5ce2a74090bb09f27f140a0cbffacd7f7fd2ab5404" exitCode=0 Mar 20 07:59:57 crc kubenswrapper[4749]: I0320 07:59:57.616590 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bvhdp" event={"ID":"7cf7e126-abcf-4e2c-8049-74789387530f","Type":"ContainerDied","Data":"074490293c61000178c3de5ce2a74090bb09f27f140a0cbffacd7f7fd2ab5404"} Mar 20 07:59:57 crc kubenswrapper[4749]: I0320 07:59:57.616822 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bvhdp" event={"ID":"7cf7e126-abcf-4e2c-8049-74789387530f","Type":"ContainerStarted","Data":"afa2224ebb02f33ef736f141a9208380ff8fc8b0fdbb8ceee52c24ada5124420"} Mar 20 07:59:58 crc kubenswrapper[4749]: I0320 07:59:58.624762 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bvhdp" event={"ID":"7cf7e126-abcf-4e2c-8049-74789387530f","Type":"ContainerStarted","Data":"ba2269494aaef401f954d0f972285e86a961760a9125a11af23db41b03022686"} Mar 20 07:59:59 crc kubenswrapper[4749]: I0320 07:59:59.635566 4749 generic.go:334] "Generic (PLEG): container finished" podID="7cf7e126-abcf-4e2c-8049-74789387530f" 
containerID="ba2269494aaef401f954d0f972285e86a961760a9125a11af23db41b03022686" exitCode=0 Mar 20 07:59:59 crc kubenswrapper[4749]: I0320 07:59:59.635626 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bvhdp" event={"ID":"7cf7e126-abcf-4e2c-8049-74789387530f","Type":"ContainerDied","Data":"ba2269494aaef401f954d0f972285e86a961760a9125a11af23db41b03022686"} Mar 20 08:00:00 crc kubenswrapper[4749]: I0320 08:00:00.160568 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566560-z8j56"] Mar 20 08:00:00 crc kubenswrapper[4749]: I0320 08:00:00.162522 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566560-z8j56" Mar 20 08:00:00 crc kubenswrapper[4749]: I0320 08:00:00.164447 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 08:00:00 crc kubenswrapper[4749]: I0320 08:00:00.169495 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 08:00:00 crc kubenswrapper[4749]: I0320 08:00:00.173450 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566560-sn5mp"] Mar 20 08:00:00 crc kubenswrapper[4749]: I0320 08:00:00.174706 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566560-sn5mp" Mar 20 08:00:00 crc kubenswrapper[4749]: I0320 08:00:00.177496 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 08:00:00 crc kubenswrapper[4749]: I0320 08:00:00.177487 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6rhdf" Mar 20 08:00:00 crc kubenswrapper[4749]: I0320 08:00:00.177556 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 08:00:00 crc kubenswrapper[4749]: I0320 08:00:00.190794 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566560-sn5mp"] Mar 20 08:00:00 crc kubenswrapper[4749]: I0320 08:00:00.191652 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566560-z8j56"] Mar 20 08:00:00 crc kubenswrapper[4749]: I0320 08:00:00.289451 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm4qc\" (UniqueName: \"kubernetes.io/projected/0f628e37-3cdd-490e-9b3f-044361a821bc-kube-api-access-vm4qc\") pod \"collect-profiles-29566560-z8j56\" (UID: \"0f628e37-3cdd-490e-9b3f-044361a821bc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566560-z8j56" Mar 20 08:00:00 crc kubenswrapper[4749]: I0320 08:00:00.289589 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xl5f\" (UniqueName: \"kubernetes.io/projected/a6b9b776-816a-4d03-9d2a-1933104588ad-kube-api-access-6xl5f\") pod \"auto-csr-approver-29566560-sn5mp\" (UID: \"a6b9b776-816a-4d03-9d2a-1933104588ad\") " pod="openshift-infra/auto-csr-approver-29566560-sn5mp" Mar 20 08:00:00 crc kubenswrapper[4749]: I0320 08:00:00.289619 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" 
(UniqueName: \"kubernetes.io/secret/0f628e37-3cdd-490e-9b3f-044361a821bc-secret-volume\") pod \"collect-profiles-29566560-z8j56\" (UID: \"0f628e37-3cdd-490e-9b3f-044361a821bc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566560-z8j56" Mar 20 08:00:00 crc kubenswrapper[4749]: I0320 08:00:00.289661 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0f628e37-3cdd-490e-9b3f-044361a821bc-config-volume\") pod \"collect-profiles-29566560-z8j56\" (UID: \"0f628e37-3cdd-490e-9b3f-044361a821bc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566560-z8j56" Mar 20 08:00:00 crc kubenswrapper[4749]: I0320 08:00:00.390984 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0f628e37-3cdd-490e-9b3f-044361a821bc-config-volume\") pod \"collect-profiles-29566560-z8j56\" (UID: \"0f628e37-3cdd-490e-9b3f-044361a821bc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566560-z8j56" Mar 20 08:00:00 crc kubenswrapper[4749]: I0320 08:00:00.391119 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vm4qc\" (UniqueName: \"kubernetes.io/projected/0f628e37-3cdd-490e-9b3f-044361a821bc-kube-api-access-vm4qc\") pod \"collect-profiles-29566560-z8j56\" (UID: \"0f628e37-3cdd-490e-9b3f-044361a821bc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566560-z8j56" Mar 20 08:00:00 crc kubenswrapper[4749]: I0320 08:00:00.391186 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xl5f\" (UniqueName: \"kubernetes.io/projected/a6b9b776-816a-4d03-9d2a-1933104588ad-kube-api-access-6xl5f\") pod \"auto-csr-approver-29566560-sn5mp\" (UID: \"a6b9b776-816a-4d03-9d2a-1933104588ad\") " pod="openshift-infra/auto-csr-approver-29566560-sn5mp" Mar 20 08:00:00 crc kubenswrapper[4749]: I0320 08:00:00.391210 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0f628e37-3cdd-490e-9b3f-044361a821bc-secret-volume\") pod \"collect-profiles-29566560-z8j56\" (UID: \"0f628e37-3cdd-490e-9b3f-044361a821bc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566560-z8j56" Mar 20 08:00:00 crc kubenswrapper[4749]: I0320 08:00:00.391979 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0f628e37-3cdd-490e-9b3f-044361a821bc-config-volume\") pod \"collect-profiles-29566560-z8j56\" (UID: \"0f628e37-3cdd-490e-9b3f-044361a821bc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566560-z8j56" Mar 20 08:00:00 crc kubenswrapper[4749]: I0320 08:00:00.396777 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0f628e37-3cdd-490e-9b3f-044361a821bc-secret-volume\") pod \"collect-profiles-29566560-z8j56\" (UID: \"0f628e37-3cdd-490e-9b3f-044361a821bc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566560-z8j56" Mar 20 08:00:00 crc kubenswrapper[4749]: I0320 08:00:00.407592 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vm4qc\" (UniqueName: \"kubernetes.io/projected/0f628e37-3cdd-490e-9b3f-044361a821bc-kube-api-access-vm4qc\") pod \"collect-profiles-29566560-z8j56\" (UID: 
\"0f628e37-3cdd-490e-9b3f-044361a821bc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566560-z8j56" Mar 20 08:00:00 crc kubenswrapper[4749]: I0320 08:00:00.407873 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xl5f\" (UniqueName: \"kubernetes.io/projected/a6b9b776-816a-4d03-9d2a-1933104588ad-kube-api-access-6xl5f\") pod \"auto-csr-approver-29566560-sn5mp\" (UID: \"a6b9b776-816a-4d03-9d2a-1933104588ad\") " pod="openshift-infra/auto-csr-approver-29566560-sn5mp" Mar 20 08:00:00 crc kubenswrapper[4749]: I0320 08:00:00.488300 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566560-z8j56" Mar 20 08:00:00 crc kubenswrapper[4749]: I0320 08:00:00.514820 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566560-sn5mp" Mar 20 08:00:00 crc kubenswrapper[4749]: I0320 08:00:00.662131 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bvhdp" event={"ID":"7cf7e126-abcf-4e2c-8049-74789387530f","Type":"ContainerStarted","Data":"03ea586dbd8ff77a19f725b9e8507aedc5c9995019a742b33c5f90fa07ebdd5c"} Mar 20 08:00:00 crc kubenswrapper[4749]: I0320 08:00:00.685412 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bvhdp" podStartSLOduration=3.236829998 podStartE2EDuration="5.685394451s" podCreationTimestamp="2026-03-20 07:59:55 +0000 UTC" firstStartedPulling="2026-03-20 07:59:57.619606956 +0000 UTC m=+2834.169264603" lastFinishedPulling="2026-03-20 08:00:00.068171399 +0000 UTC m=+2836.617829056" observedRunningTime="2026-03-20 08:00:00.683398644 +0000 UTC m=+2837.233056331" watchObservedRunningTime="2026-03-20 08:00:00.685394451 +0000 UTC m=+2837.235052098" Mar 20 08:00:00 crc kubenswrapper[4749]: W0320 08:00:00.974715 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f628e37_3cdd_490e_9b3f_044361a821bc.slice/crio-40de0e7d004ef451ed7f33ac626edf7947dc0a5c813e321a3099d143ee26da24 WatchSource:0}: Error finding container 40de0e7d004ef451ed7f33ac626edf7947dc0a5c813e321a3099d143ee26da24: Status 404 returned error can't find the container with id 40de0e7d004ef451ed7f33ac626edf7947dc0a5c813e321a3099d143ee26da24 Mar 20 08:00:00 crc kubenswrapper[4749]: I0320 08:00:00.975239 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566560-z8j56"] Mar 20 08:00:01 crc kubenswrapper[4749]: I0320 08:00:01.051515 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566560-sn5mp"] Mar 20 08:00:01 crc kubenswrapper[4749]: W0320 08:00:01.058215 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6b9b776_816a_4d03_9d2a_1933104588ad.slice/crio-b58b67c59e121e5c9c199aed955bff15ca83c2399fa05782e0c2e97bf80aa209 WatchSource:0}: Error finding container b58b67c59e121e5c9c199aed955bff15ca83c2399fa05782e0c2e97bf80aa209: Status 404 returned error can't find the container with id b58b67c59e121e5c9c199aed955bff15ca83c2399fa05782e0c2e97bf80aa209 Mar 20 08:00:01 crc kubenswrapper[4749]: I0320 08:00:01.675433 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566560-sn5mp" 
event={"ID":"a6b9b776-816a-4d03-9d2a-1933104588ad","Type":"ContainerStarted","Data":"b58b67c59e121e5c9c199aed955bff15ca83c2399fa05782e0c2e97bf80aa209"} Mar 20 08:00:01 crc kubenswrapper[4749]: I0320 08:00:01.677873 4749 generic.go:334] "Generic (PLEG): container finished" podID="0f628e37-3cdd-490e-9b3f-044361a821bc" containerID="539acf19a6b27b96a6196355e7123bd33181077d89f2bdd5cf4a494fd0cdf012" exitCode=0 Mar 20 08:00:01 crc kubenswrapper[4749]: I0320 08:00:01.678547 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566560-z8j56" event={"ID":"0f628e37-3cdd-490e-9b3f-044361a821bc","Type":"ContainerDied","Data":"539acf19a6b27b96a6196355e7123bd33181077d89f2bdd5cf4a494fd0cdf012"} Mar 20 08:00:01 crc kubenswrapper[4749]: I0320 08:00:01.678599 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566560-z8j56" event={"ID":"0f628e37-3cdd-490e-9b3f-044361a821bc","Type":"ContainerStarted","Data":"40de0e7d004ef451ed7f33ac626edf7947dc0a5c813e321a3099d143ee26da24"} Mar 20 08:00:02 crc kubenswrapper[4749]: I0320 08:00:02.178199 4749 scope.go:117] "RemoveContainer" containerID="cc4625b9209797aa379352fd9f58107f7e42d2b4f479b4f5c1a0f1d6334d65a3" Mar 20 08:00:02 crc kubenswrapper[4749]: E0320 08:00:02.178432 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 08:00:03 crc kubenswrapper[4749]: I0320 08:00:03.071380 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566560-z8j56" Mar 20 08:00:03 crc kubenswrapper[4749]: I0320 08:00:03.237249 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vm4qc\" (UniqueName: \"kubernetes.io/projected/0f628e37-3cdd-490e-9b3f-044361a821bc-kube-api-access-vm4qc\") pod \"0f628e37-3cdd-490e-9b3f-044361a821bc\" (UID: \"0f628e37-3cdd-490e-9b3f-044361a821bc\") " Mar 20 08:00:03 crc kubenswrapper[4749]: I0320 08:00:03.237376 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0f628e37-3cdd-490e-9b3f-044361a821bc-config-volume\") pod \"0f628e37-3cdd-490e-9b3f-044361a821bc\" (UID: \"0f628e37-3cdd-490e-9b3f-044361a821bc\") " Mar 20 08:00:03 crc kubenswrapper[4749]: I0320 08:00:03.237506 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0f628e37-3cdd-490e-9b3f-044361a821bc-secret-volume\") pod \"0f628e37-3cdd-490e-9b3f-044361a821bc\" (UID: \"0f628e37-3cdd-490e-9b3f-044361a821bc\") " Mar 20 08:00:03 crc kubenswrapper[4749]: I0320 08:00:03.238674 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f628e37-3cdd-490e-9b3f-044361a821bc-config-volume" (OuterVolumeSpecName: "config-volume") pod "0f628e37-3cdd-490e-9b3f-044361a821bc" (UID: "0f628e37-3cdd-490e-9b3f-044361a821bc"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:00:03 crc kubenswrapper[4749]: I0320 08:00:03.243722 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f628e37-3cdd-490e-9b3f-044361a821bc-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0f628e37-3cdd-490e-9b3f-044361a821bc" (UID: "0f628e37-3cdd-490e-9b3f-044361a821bc"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:00:03 crc kubenswrapper[4749]: I0320 08:00:03.244030 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f628e37-3cdd-490e-9b3f-044361a821bc-kube-api-access-vm4qc" (OuterVolumeSpecName: "kube-api-access-vm4qc") pod "0f628e37-3cdd-490e-9b3f-044361a821bc" (UID: "0f628e37-3cdd-490e-9b3f-044361a821bc"). InnerVolumeSpecName "kube-api-access-vm4qc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:00:03 crc kubenswrapper[4749]: I0320 08:00:03.339533 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0f628e37-3cdd-490e-9b3f-044361a821bc-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 08:00:03 crc kubenswrapper[4749]: I0320 08:00:03.339591 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vm4qc\" (UniqueName: \"kubernetes.io/projected/0f628e37-3cdd-490e-9b3f-044361a821bc-kube-api-access-vm4qc\") on node \"crc\" DevicePath \"\"" Mar 20 08:00:03 crc kubenswrapper[4749]: I0320 08:00:03.339645 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0f628e37-3cdd-490e-9b3f-044361a821bc-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 08:00:03 crc kubenswrapper[4749]: I0320 08:00:03.696693 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566560-z8j56" event={"ID":"0f628e37-3cdd-490e-9b3f-044361a821bc","Type":"ContainerDied","Data":"40de0e7d004ef451ed7f33ac626edf7947dc0a5c813e321a3099d143ee26da24"} Mar 20 08:00:03 crc kubenswrapper[4749]: I0320 08:00:03.696733 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40de0e7d004ef451ed7f33ac626edf7947dc0a5c813e321a3099d143ee26da24" Mar 20 08:00:03 crc kubenswrapper[4749]: I0320 08:00:03.696763 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566560-z8j56" Mar 20 08:00:04 crc kubenswrapper[4749]: I0320 08:00:04.168698 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566515-6hw5x"] Mar 20 08:00:04 crc kubenswrapper[4749]: I0320 08:00:04.176613 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566515-6hw5x"] Mar 20 08:00:04 crc kubenswrapper[4749]: I0320 08:00:04.188542 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2af8695f-a945-411d-ac95-03191fb3080d" path="/var/lib/kubelet/pods/2af8695f-a945-411d-ac95-03191fb3080d/volumes" Mar 20 08:00:06 crc kubenswrapper[4749]: I0320 08:00:06.115222 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bvhdp" Mar 20 08:00:06 crc kubenswrapper[4749]: I0320 08:00:06.115264 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bvhdp" Mar 20 08:00:06 crc kubenswrapper[4749]: I0320 08:00:06.187147 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bvhdp" Mar 20 08:00:06 crc kubenswrapper[4749]: I0320 08:00:06.784884 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bvhdp" Mar 20 08:00:06 crc kubenswrapper[4749]: I0320 08:00:06.839103 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bvhdp"] Mar 20 08:00:07 crc kubenswrapper[4749]: I0320 08:00:07.742227 4749 generic.go:334] "Generic (PLEG): container finished" podID="a6b9b776-816a-4d03-9d2a-1933104588ad" containerID="7c166719e7009a22ece92a2695fff8bc425b2d9889214284173ce3b96d182ebd" exitCode=0 Mar 20 08:00:07 crc kubenswrapper[4749]: I0320 08:00:07.742327 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566560-sn5mp" event={"ID":"a6b9b776-816a-4d03-9d2a-1933104588ad","Type":"ContainerDied","Data":"7c166719e7009a22ece92a2695fff8bc425b2d9889214284173ce3b96d182ebd"} Mar 20 08:00:08 crc kubenswrapper[4749]: I0320 08:00:08.177003 4749 scope.go:117] "RemoveContainer" containerID="ae3deabce868852ca8b2c75b310320888cefe3d1f8956d9347d0fcd6a7d22838" Mar 20 08:00:08 crc kubenswrapper[4749]: E0320 08:00:08.177349 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 08:00:08 crc kubenswrapper[4749]: I0320 08:00:08.754402 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bvhdp" podUID="7cf7e126-abcf-4e2c-8049-74789387530f" containerName="registry-server" containerID="cri-o://03ea586dbd8ff77a19f725b9e8507aedc5c9995019a742b33c5f90fa07ebdd5c" gracePeriod=2 Mar 20 08:00:09 crc kubenswrapper[4749]: I0320 08:00:09.112677 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566560-sn5mp" Mar 20 08:00:09 crc kubenswrapper[4749]: I0320 08:00:09.246647 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xl5f\" (UniqueName: \"kubernetes.io/projected/a6b9b776-816a-4d03-9d2a-1933104588ad-kube-api-access-6xl5f\") pod \"a6b9b776-816a-4d03-9d2a-1933104588ad\" (UID: \"a6b9b776-816a-4d03-9d2a-1933104588ad\") " Mar 20 08:00:09 crc kubenswrapper[4749]: I0320 08:00:09.252842 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6b9b776-816a-4d03-9d2a-1933104588ad-kube-api-access-6xl5f" (OuterVolumeSpecName: "kube-api-access-6xl5f") pod "a6b9b776-816a-4d03-9d2a-1933104588ad" (UID: "a6b9b776-816a-4d03-9d2a-1933104588ad"). InnerVolumeSpecName "kube-api-access-6xl5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:00:09 crc kubenswrapper[4749]: I0320 08:00:09.268242 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bvhdp" Mar 20 08:00:09 crc kubenswrapper[4749]: I0320 08:00:09.348785 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkspb\" (UniqueName: \"kubernetes.io/projected/7cf7e126-abcf-4e2c-8049-74789387530f-kube-api-access-xkspb\") pod \"7cf7e126-abcf-4e2c-8049-74789387530f\" (UID: \"7cf7e126-abcf-4e2c-8049-74789387530f\") " Mar 20 08:00:09 crc kubenswrapper[4749]: I0320 08:00:09.348981 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cf7e126-abcf-4e2c-8049-74789387530f-catalog-content\") pod \"7cf7e126-abcf-4e2c-8049-74789387530f\" (UID: \"7cf7e126-abcf-4e2c-8049-74789387530f\") " Mar 20 08:00:09 crc kubenswrapper[4749]: I0320 08:00:09.349122 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cf7e126-abcf-4e2c-8049-74789387530f-utilities\") pod \"7cf7e126-abcf-4e2c-8049-74789387530f\" (UID: \"7cf7e126-abcf-4e2c-8049-74789387530f\") " Mar 20 08:00:09 crc kubenswrapper[4749]: I0320 08:00:09.349568 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xl5f\" (UniqueName: \"kubernetes.io/projected/a6b9b776-816a-4d03-9d2a-1933104588ad-kube-api-access-6xl5f\") on node \"crc\" DevicePath \"\"" Mar 20 08:00:09 crc kubenswrapper[4749]: I0320 08:00:09.351518 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cf7e126-abcf-4e2c-8049-74789387530f-utilities" (OuterVolumeSpecName: "utilities") pod "7cf7e126-abcf-4e2c-8049-74789387530f" (UID: "7cf7e126-abcf-4e2c-8049-74789387530f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:00:09 crc kubenswrapper[4749]: I0320 08:00:09.354851 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cf7e126-abcf-4e2c-8049-74789387530f-kube-api-access-xkspb" (OuterVolumeSpecName: "kube-api-access-xkspb") pod "7cf7e126-abcf-4e2c-8049-74789387530f" (UID: "7cf7e126-abcf-4e2c-8049-74789387530f"). InnerVolumeSpecName "kube-api-access-xkspb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:00:09 crc kubenswrapper[4749]: I0320 08:00:09.451488 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cf7e126-abcf-4e2c-8049-74789387530f-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 08:00:09 crc kubenswrapper[4749]: I0320 08:00:09.451536 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkspb\" (UniqueName: \"kubernetes.io/projected/7cf7e126-abcf-4e2c-8049-74789387530f-kube-api-access-xkspb\") on node \"crc\" DevicePath \"\"" Mar 20 08:00:09 crc kubenswrapper[4749]: I0320 08:00:09.766768 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cf7e126-abcf-4e2c-8049-74789387530f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7cf7e126-abcf-4e2c-8049-74789387530f" (UID: "7cf7e126-abcf-4e2c-8049-74789387530f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:00:09 crc kubenswrapper[4749]: I0320 08:00:09.769487 4749 generic.go:334] "Generic (PLEG): container finished" podID="7cf7e126-abcf-4e2c-8049-74789387530f" containerID="03ea586dbd8ff77a19f725b9e8507aedc5c9995019a742b33c5f90fa07ebdd5c" exitCode=0 Mar 20 08:00:09 crc kubenswrapper[4749]: I0320 08:00:09.769575 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bvhdp" event={"ID":"7cf7e126-abcf-4e2c-8049-74789387530f","Type":"ContainerDied","Data":"03ea586dbd8ff77a19f725b9e8507aedc5c9995019a742b33c5f90fa07ebdd5c"} Mar 20 08:00:09 crc kubenswrapper[4749]: I0320 08:00:09.769607 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bvhdp" event={"ID":"7cf7e126-abcf-4e2c-8049-74789387530f","Type":"ContainerDied","Data":"afa2224ebb02f33ef736f141a9208380ff8fc8b0fdbb8ceee52c24ada5124420"} Mar 20 08:00:09 crc kubenswrapper[4749]: I0320 08:00:09.769626 4749 scope.go:117] "RemoveContainer" containerID="03ea586dbd8ff77a19f725b9e8507aedc5c9995019a742b33c5f90fa07ebdd5c" Mar 20 08:00:09 crc kubenswrapper[4749]: I0320 08:00:09.769772 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bvhdp" Mar 20 08:00:09 crc kubenswrapper[4749]: I0320 08:00:09.785542 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566560-sn5mp" event={"ID":"a6b9b776-816a-4d03-9d2a-1933104588ad","Type":"ContainerDied","Data":"b58b67c59e121e5c9c199aed955bff15ca83c2399fa05782e0c2e97bf80aa209"} Mar 20 08:00:09 crc kubenswrapper[4749]: I0320 08:00:09.785580 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b58b67c59e121e5c9c199aed955bff15ca83c2399fa05782e0c2e97bf80aa209" Mar 20 08:00:09 crc kubenswrapper[4749]: I0320 08:00:09.785661 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566560-sn5mp" Mar 20 08:00:09 crc kubenswrapper[4749]: I0320 08:00:09.816761 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bvhdp"] Mar 20 08:00:09 crc kubenswrapper[4749]: I0320 08:00:09.824757 4749 scope.go:117] "RemoveContainer" containerID="ba2269494aaef401f954d0f972285e86a961760a9125a11af23db41b03022686" Mar 20 08:00:09 crc kubenswrapper[4749]: I0320 08:00:09.845259 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bvhdp"] Mar 20 08:00:09 crc kubenswrapper[4749]: I0320 08:00:09.855445 4749 scope.go:117] "RemoveContainer" containerID="074490293c61000178c3de5ce2a74090bb09f27f140a0cbffacd7f7fd2ab5404" Mar 20 08:00:09 crc kubenswrapper[4749]: I0320 08:00:09.856771 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cf7e126-abcf-4e2c-8049-74789387530f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 08:00:09 crc kubenswrapper[4749]: I0320 08:00:09.884319 4749 scope.go:117] "RemoveContainer" containerID="03ea586dbd8ff77a19f725b9e8507aedc5c9995019a742b33c5f90fa07ebdd5c" Mar 20 08:00:09 crc kubenswrapper[4749]: E0320 08:00:09.884936 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03ea586dbd8ff77a19f725b9e8507aedc5c9995019a742b33c5f90fa07ebdd5c\": container with ID starting with 03ea586dbd8ff77a19f725b9e8507aedc5c9995019a742b33c5f90fa07ebdd5c not found: ID does not exist" containerID="03ea586dbd8ff77a19f725b9e8507aedc5c9995019a742b33c5f90fa07ebdd5c" Mar 20 08:00:09 crc kubenswrapper[4749]: I0320 08:00:09.885077 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03ea586dbd8ff77a19f725b9e8507aedc5c9995019a742b33c5f90fa07ebdd5c"} err="failed to get container status \"03ea586dbd8ff77a19f725b9e8507aedc5c9995019a742b33c5f90fa07ebdd5c\": rpc error: code = NotFound desc = could not find container \"03ea586dbd8ff77a19f725b9e8507aedc5c9995019a742b33c5f90fa07ebdd5c\": container with ID starting with 03ea586dbd8ff77a19f725b9e8507aedc5c9995019a742b33c5f90fa07ebdd5c not found: ID does not exist" Mar 20 08:00:09 crc kubenswrapper[4749]: I0320 08:00:09.885193 4749 scope.go:117] "RemoveContainer" containerID="ba2269494aaef401f954d0f972285e86a961760a9125a11af23db41b03022686" Mar 20 08:00:09 crc kubenswrapper[4749]: E0320 08:00:09.885716 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba2269494aaef401f954d0f972285e86a961760a9125a11af23db41b03022686\": container with ID starting with ba2269494aaef401f954d0f972285e86a961760a9125a11af23db41b03022686 not found: ID does not exist" containerID="ba2269494aaef401f954d0f972285e86a961760a9125a11af23db41b03022686" Mar 20 08:00:09 crc kubenswrapper[4749]: I0320 08:00:09.885779 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba2269494aaef401f954d0f972285e86a961760a9125a11af23db41b03022686"} err="failed to get container status \"ba2269494aaef401f954d0f972285e86a961760a9125a11af23db41b03022686\": rpc error: code = NotFound desc = could not find container \"ba2269494aaef401f954d0f972285e86a961760a9125a11af23db41b03022686\": container with ID starting with ba2269494aaef401f954d0f972285e86a961760a9125a11af23db41b03022686 not found: ID does not exist" Mar 20 08:00:09 crc 
kubenswrapper[4749]: I0320 08:00:09.885806 4749 scope.go:117] "RemoveContainer" containerID="074490293c61000178c3de5ce2a74090bb09f27f140a0cbffacd7f7fd2ab5404" Mar 20 08:00:09 crc kubenswrapper[4749]: E0320 08:00:09.886228 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"074490293c61000178c3de5ce2a74090bb09f27f140a0cbffacd7f7fd2ab5404\": container with ID starting with 074490293c61000178c3de5ce2a74090bb09f27f140a0cbffacd7f7fd2ab5404 not found: ID does not exist" containerID="074490293c61000178c3de5ce2a74090bb09f27f140a0cbffacd7f7fd2ab5404" Mar 20 08:00:09 crc kubenswrapper[4749]: I0320 08:00:09.886402 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"074490293c61000178c3de5ce2a74090bb09f27f140a0cbffacd7f7fd2ab5404"} err="failed to get container status \"074490293c61000178c3de5ce2a74090bb09f27f140a0cbffacd7f7fd2ab5404\": rpc error: code = NotFound desc = could not find container \"074490293c61000178c3de5ce2a74090bb09f27f140a0cbffacd7f7fd2ab5404\": container with ID starting with 074490293c61000178c3de5ce2a74090bb09f27f140a0cbffacd7f7fd2ab5404 not found: ID does not exist" Mar 20 08:00:10 crc kubenswrapper[4749]: I0320 08:00:10.189811 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cf7e126-abcf-4e2c-8049-74789387530f" path="/var/lib/kubelet/pods/7cf7e126-abcf-4e2c-8049-74789387530f/volumes" Mar 20 08:00:10 crc kubenswrapper[4749]: I0320 08:00:10.190853 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566554-7qzth"] Mar 20 08:00:10 crc kubenswrapper[4749]: I0320 08:00:10.193645 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566554-7qzth"] Mar 20 08:00:12 crc kubenswrapper[4749]: I0320 08:00:12.190149 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="463568fe-3529-4730-a2e6-6317aef7e2a7" path="/var/lib/kubelet/pods/463568fe-3529-4730-a2e6-6317aef7e2a7/volumes" Mar 20 08:00:17 crc kubenswrapper[4749]: I0320 08:00:17.177408 4749 scope.go:117] "RemoveContainer" containerID="cc4625b9209797aa379352fd9f58107f7e42d2b4f479b4f5c1a0f1d6334d65a3" Mar 20 08:00:17 crc kubenswrapper[4749]: E0320 08:00:17.178018 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 08:00:20 crc kubenswrapper[4749]: I0320 08:00:20.177474 4749 scope.go:117] "RemoveContainer" containerID="ae3deabce868852ca8b2c75b310320888cefe3d1f8956d9347d0fcd6a7d22838" Mar 20 08:00:20 crc kubenswrapper[4749]: E0320 08:00:20.178130 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 08:00:31 crc kubenswrapper[4749]: I0320 08:00:31.177400 4749 scope.go:117] "RemoveContainer" containerID="ae3deabce868852ca8b2c75b310320888cefe3d1f8956d9347d0fcd6a7d22838" Mar 20 08:00:31 crc kubenswrapper[4749]: E0320 08:00:31.178142 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 08:00:32 crc kubenswrapper[4749]: I0320 08:00:32.178347 4749 scope.go:117] "RemoveContainer" containerID="cc4625b9209797aa379352fd9f58107f7e42d2b4f479b4f5c1a0f1d6334d65a3" Mar 20 08:00:32 crc kubenswrapper[4749]: E0320 08:00:32.178827 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 08:00:34 crc kubenswrapper[4749]: I0320 08:00:34.517498 4749 patch_prober.go:28] interesting pod/machine-config-daemon-fxqfd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:00:34 crc kubenswrapper[4749]: I0320 08:00:34.517600 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:00:45 crc kubenswrapper[4749]: I0320 08:00:45.177442 4749 scope.go:117] "RemoveContainer" containerID="ae3deabce868852ca8b2c75b310320888cefe3d1f8956d9347d0fcd6a7d22838" Mar 20 08:00:45 crc kubenswrapper[4749]: I0320 08:00:45.178418 4749 scope.go:117] "RemoveContainer" containerID="cc4625b9209797aa379352fd9f58107f7e42d2b4f479b4f5c1a0f1d6334d65a3" Mar 20 08:00:45 crc kubenswrapper[4749]: E0320 08:00:45.178861 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 08:00:45 crc kubenswrapper[4749]: E0320 08:00:45.178862 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 08:00:56 crc kubenswrapper[4749]: I0320 08:00:56.539426 4749 scope.go:117] "RemoveContainer" containerID="04c94989401b3e171aa790f44e5e33a1bb2df54d0c87db557e5ced47d88d57d8" Mar 20 08:00:56 crc kubenswrapper[4749]: I0320 08:00:56.570178 4749 scope.go:117] "RemoveContainer" containerID="55797b51668cbef0733402e66e5942cf6e000dd7dad4463f624bd912f5188d36" Mar 20 08:00:57 crc kubenswrapper[4749]: I0320 08:00:57.221117 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bbxzx"] Mar 20 08:00:57 crc kubenswrapper[4749]: E0320 08:00:57.221599 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f628e37-3cdd-490e-9b3f-044361a821bc" containerName="collect-profiles" Mar 20 08:00:57 crc kubenswrapper[4749]: I0320 
08:00:57.221620 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f628e37-3cdd-490e-9b3f-044361a821bc" containerName="collect-profiles" Mar 20 08:00:57 crc kubenswrapper[4749]: E0320 08:00:57.221638 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cf7e126-abcf-4e2c-8049-74789387530f" containerName="extract-content" Mar 20 08:00:57 crc kubenswrapper[4749]: I0320 08:00:57.221650 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cf7e126-abcf-4e2c-8049-74789387530f" containerName="extract-content" Mar 20 08:00:57 crc kubenswrapper[4749]: E0320 08:00:57.221683 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cf7e126-abcf-4e2c-8049-74789387530f" containerName="extract-utilities" Mar 20 08:00:57 crc kubenswrapper[4749]: I0320 08:00:57.221696 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cf7e126-abcf-4e2c-8049-74789387530f" containerName="extract-utilities" Mar 20 08:00:57 crc kubenswrapper[4749]: E0320 08:00:57.221723 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6b9b776-816a-4d03-9d2a-1933104588ad" containerName="oc" Mar 20 08:00:57 crc kubenswrapper[4749]: I0320 08:00:57.221736 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6b9b776-816a-4d03-9d2a-1933104588ad" containerName="oc" Mar 20 08:00:57 crc kubenswrapper[4749]: E0320 08:00:57.221779 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cf7e126-abcf-4e2c-8049-74789387530f" containerName="registry-server" Mar 20 08:00:57 crc kubenswrapper[4749]: I0320 08:00:57.221791 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cf7e126-abcf-4e2c-8049-74789387530f" containerName="registry-server" Mar 20 08:00:57 crc kubenswrapper[4749]: I0320 08:00:57.222050 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cf7e126-abcf-4e2c-8049-74789387530f" containerName="registry-server" Mar 20 08:00:57 crc kubenswrapper[4749]: I0320 08:00:57.222079 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f628e37-3cdd-490e-9b3f-044361a821bc" containerName="collect-profiles" Mar 20 08:00:57 crc kubenswrapper[4749]: I0320 08:00:57.222119 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6b9b776-816a-4d03-9d2a-1933104588ad" containerName="oc" Mar 20 08:00:57 crc kubenswrapper[4749]: I0320 08:00:57.224001 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bbxzx" Mar 20 08:00:57 crc kubenswrapper[4749]: I0320 08:00:57.252815 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bbxzx"] Mar 20 08:00:57 crc kubenswrapper[4749]: I0320 08:00:57.369695 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c06efc2-f581-474c-bd22-1b8f6d4625fa-utilities\") pod \"certified-operators-bbxzx\" (UID: \"7c06efc2-f581-474c-bd22-1b8f6d4625fa\") " pod="openshift-marketplace/certified-operators-bbxzx" Mar 20 08:00:57 crc kubenswrapper[4749]: I0320 08:00:57.369836 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c06efc2-f581-474c-bd22-1b8f6d4625fa-catalog-content\") pod \"certified-operators-bbxzx\" (UID: \"7c06efc2-f581-474c-bd22-1b8f6d4625fa\") " pod="openshift-marketplace/certified-operators-bbxzx" Mar 20 08:00:57 crc kubenswrapper[4749]: I0320 08:00:57.369984 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s44sd\" (UniqueName: \"kubernetes.io/projected/7c06efc2-f581-474c-bd22-1b8f6d4625fa-kube-api-access-s44sd\") pod \"certified-operators-bbxzx\" (UID: \"7c06efc2-f581-474c-bd22-1b8f6d4625fa\") " pod="openshift-marketplace/certified-operators-bbxzx" Mar 20 08:00:57 crc kubenswrapper[4749]: I0320 08:00:57.471536 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c06efc2-f581-474c-bd22-1b8f6d4625fa-catalog-content\") pod \"certified-operators-bbxzx\" (UID: \"7c06efc2-f581-474c-bd22-1b8f6d4625fa\") " pod="openshift-marketplace/certified-operators-bbxzx" Mar 20 08:00:57 crc kubenswrapper[4749]: I0320 08:00:57.471663 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s44sd\" (UniqueName: \"kubernetes.io/projected/7c06efc2-f581-474c-bd22-1b8f6d4625fa-kube-api-access-s44sd\") pod \"certified-operators-bbxzx\" (UID: \"7c06efc2-f581-474c-bd22-1b8f6d4625fa\") " pod="openshift-marketplace/certified-operators-bbxzx" Mar 20 08:00:57 crc kubenswrapper[4749]: I0320 08:00:57.471699 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c06efc2-f581-474c-bd22-1b8f6d4625fa-utilities\") pod \"certified-operators-bbxzx\" (UID: \"7c06efc2-f581-474c-bd22-1b8f6d4625fa\") " pod="openshift-marketplace/certified-operators-bbxzx" Mar 20 08:00:57 crc kubenswrapper[4749]: I0320 08:00:57.472105 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c06efc2-f581-474c-bd22-1b8f6d4625fa-catalog-content\") pod \"certified-operators-bbxzx\" (UID: \"7c06efc2-f581-474c-bd22-1b8f6d4625fa\") " pod="openshift-marketplace/certified-operators-bbxzx" Mar 20 08:00:57 crc kubenswrapper[4749]: I0320 08:00:57.472135 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c06efc2-f581-474c-bd22-1b8f6d4625fa-utilities\") pod \"certified-operators-bbxzx\" (UID: \"7c06efc2-f581-474c-bd22-1b8f6d4625fa\") " pod="openshift-marketplace/certified-operators-bbxzx" Mar 20 08:00:57 crc kubenswrapper[4749]: I0320 08:00:57.491679 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-s44sd\" (UniqueName: \"kubernetes.io/projected/7c06efc2-f581-474c-bd22-1b8f6d4625fa-kube-api-access-s44sd\") pod \"certified-operators-bbxzx\" (UID: \"7c06efc2-f581-474c-bd22-1b8f6d4625fa\") " pod="openshift-marketplace/certified-operators-bbxzx" Mar 20 08:00:57 crc kubenswrapper[4749]: I0320 08:00:57.551588 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bbxzx" Mar 20 08:00:58 crc kubenswrapper[4749]: I0320 08:00:58.042725 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bbxzx"] Mar 20 08:00:58 crc kubenswrapper[4749]: I0320 08:00:58.177009 4749 scope.go:117] "RemoveContainer" containerID="ae3deabce868852ca8b2c75b310320888cefe3d1f8956d9347d0fcd6a7d22838" Mar 20 08:00:58 crc kubenswrapper[4749]: E0320 08:00:58.177939 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 08:00:58 crc kubenswrapper[4749]: I0320 08:00:58.294576 4749 generic.go:334] "Generic (PLEG): container finished" podID="7c06efc2-f581-474c-bd22-1b8f6d4625fa" containerID="5daf76e3735d3c9bd53bbc2fb1bff963e7ffdbfd1d30886442b04e4c6dd4cda5" exitCode=0 Mar 20 08:00:58 crc kubenswrapper[4749]: I0320 08:00:58.294618 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bbxzx" event={"ID":"7c06efc2-f581-474c-bd22-1b8f6d4625fa","Type":"ContainerDied","Data":"5daf76e3735d3c9bd53bbc2fb1bff963e7ffdbfd1d30886442b04e4c6dd4cda5"} Mar 20 08:00:58 crc kubenswrapper[4749]: I0320 08:00:58.294648 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bbxzx" event={"ID":"7c06efc2-f581-474c-bd22-1b8f6d4625fa","Type":"ContainerStarted","Data":"23e46b614b0df364b2c0c244549266eedaf884da88821d79a0766ff49bbb75f5"} Mar 20 08:00:58 crc kubenswrapper[4749]: E0320 08:00:58.296452 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c06efc2_f581_474c_bd22_1b8f6d4625fa.slice/crio-conmon-5daf76e3735d3c9bd53bbc2fb1bff963e7ffdbfd1d30886442b04e4c6dd4cda5.scope\": RecentStats: unable to find data in memory cache]" Mar 20 08:00:59 crc kubenswrapper[4749]: I0320 08:00:59.178457 4749 scope.go:117] "RemoveContainer" containerID="cc4625b9209797aa379352fd9f58107f7e42d2b4f479b4f5c1a0f1d6334d65a3" Mar 20 08:00:59 crc kubenswrapper[4749]: E0320 08:00:59.179685 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 08:00:59 crc kubenswrapper[4749]: I0320 08:00:59.311200 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bbxzx" event={"ID":"7c06efc2-f581-474c-bd22-1b8f6d4625fa","Type":"ContainerStarted","Data":"c9010f7f050f3eefbf83af0959edfe3e33ea1f217f4dbb230f034467ecf85b43"} Mar 20 08:01:00 crc kubenswrapper[4749]: I0320 08:01:00.329190 4749 
generic.go:334] "Generic (PLEG): container finished" podID="7c06efc2-f581-474c-bd22-1b8f6d4625fa" containerID="c9010f7f050f3eefbf83af0959edfe3e33ea1f217f4dbb230f034467ecf85b43" exitCode=0 Mar 20 08:01:00 crc kubenswrapper[4749]: I0320 08:01:00.329440 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bbxzx" event={"ID":"7c06efc2-f581-474c-bd22-1b8f6d4625fa","Type":"ContainerDied","Data":"c9010f7f050f3eefbf83af0959edfe3e33ea1f217f4dbb230f034467ecf85b43"} Mar 20 08:01:01 crc kubenswrapper[4749]: I0320 08:01:01.343546 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bbxzx" event={"ID":"7c06efc2-f581-474c-bd22-1b8f6d4625fa","Type":"ContainerStarted","Data":"26cf4f009f40f8b1b7a7309902727aef9303fe82a1eb71586b197f826591dd4d"} Mar 20 08:01:01 crc kubenswrapper[4749]: I0320 08:01:01.372820 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bbxzx" podStartSLOduration=1.899961655 podStartE2EDuration="4.37278958s" podCreationTimestamp="2026-03-20 08:00:57 +0000 UTC" firstStartedPulling="2026-03-20 08:00:58.297154968 +0000 UTC m=+2894.846812615" lastFinishedPulling="2026-03-20 08:01:00.769982863 +0000 UTC m=+2897.319640540" observedRunningTime="2026-03-20 08:01:01.369840209 +0000 UTC m=+2897.919497856" watchObservedRunningTime="2026-03-20 08:01:01.37278958 +0000 UTC m=+2897.922447257" Mar 20 08:01:04 crc kubenswrapper[4749]: I0320 08:01:04.514900 4749 patch_prober.go:28] interesting pod/machine-config-daemon-fxqfd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:01:04 crc kubenswrapper[4749]: I0320 08:01:04.515400 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:01:07 crc kubenswrapper[4749]: I0320 08:01:07.552969 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bbxzx" Mar 20 08:01:07 crc kubenswrapper[4749]: I0320 08:01:07.553350 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bbxzx" Mar 20 08:01:07 crc kubenswrapper[4749]: I0320 08:01:07.630687 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bbxzx" Mar 20 08:01:08 crc kubenswrapper[4749]: I0320 08:01:08.491099 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bbxzx" Mar 20 08:01:08 crc kubenswrapper[4749]: I0320 08:01:08.557649 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bbxzx"] Mar 20 08:01:10 crc kubenswrapper[4749]: I0320 08:01:10.430510 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bbxzx" podUID="7c06efc2-f581-474c-bd22-1b8f6d4625fa" containerName="registry-server" containerID="cri-o://26cf4f009f40f8b1b7a7309902727aef9303fe82a1eb71586b197f826591dd4d" gracePeriod=2 Mar 20 08:01:10 crc 
kubenswrapper[4749]: I0320 08:01:10.952811 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bbxzx" Mar 20 08:01:11 crc kubenswrapper[4749]: I0320 08:01:11.024479 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s44sd\" (UniqueName: \"kubernetes.io/projected/7c06efc2-f581-474c-bd22-1b8f6d4625fa-kube-api-access-s44sd\") pod \"7c06efc2-f581-474c-bd22-1b8f6d4625fa\" (UID: \"7c06efc2-f581-474c-bd22-1b8f6d4625fa\") " Mar 20 08:01:11 crc kubenswrapper[4749]: I0320 08:01:11.024575 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c06efc2-f581-474c-bd22-1b8f6d4625fa-catalog-content\") pod \"7c06efc2-f581-474c-bd22-1b8f6d4625fa\" (UID: \"7c06efc2-f581-474c-bd22-1b8f6d4625fa\") " Mar 20 08:01:11 crc kubenswrapper[4749]: I0320 08:01:11.024637 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c06efc2-f581-474c-bd22-1b8f6d4625fa-utilities\") pod \"7c06efc2-f581-474c-bd22-1b8f6d4625fa\" (UID: \"7c06efc2-f581-474c-bd22-1b8f6d4625fa\") " Mar 20 08:01:11 crc kubenswrapper[4749]: I0320 08:01:11.025639 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c06efc2-f581-474c-bd22-1b8f6d4625fa-utilities" (OuterVolumeSpecName: "utilities") pod "7c06efc2-f581-474c-bd22-1b8f6d4625fa" (UID: "7c06efc2-f581-474c-bd22-1b8f6d4625fa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:01:11 crc kubenswrapper[4749]: I0320 08:01:11.031389 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c06efc2-f581-474c-bd22-1b8f6d4625fa-kube-api-access-s44sd" (OuterVolumeSpecName: "kube-api-access-s44sd") pod "7c06efc2-f581-474c-bd22-1b8f6d4625fa" (UID: "7c06efc2-f581-474c-bd22-1b8f6d4625fa"). InnerVolumeSpecName "kube-api-access-s44sd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:01:11 crc kubenswrapper[4749]: I0320 08:01:11.080464 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c06efc2-f581-474c-bd22-1b8f6d4625fa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7c06efc2-f581-474c-bd22-1b8f6d4625fa" (UID: "7c06efc2-f581-474c-bd22-1b8f6d4625fa"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:01:11 crc kubenswrapper[4749]: I0320 08:01:11.126621 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s44sd\" (UniqueName: \"kubernetes.io/projected/7c06efc2-f581-474c-bd22-1b8f6d4625fa-kube-api-access-s44sd\") on node \"crc\" DevicePath \"\"" Mar 20 08:01:11 crc kubenswrapper[4749]: I0320 08:01:11.126658 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c06efc2-f581-474c-bd22-1b8f6d4625fa-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 08:01:11 crc kubenswrapper[4749]: I0320 08:01:11.126669 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c06efc2-f581-474c-bd22-1b8f6d4625fa-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 08:01:11 crc kubenswrapper[4749]: I0320 08:01:11.177115 4749 scope.go:117] "RemoveContainer" containerID="ae3deabce868852ca8b2c75b310320888cefe3d1f8956d9347d0fcd6a7d22838" Mar 20 08:01:11 crc kubenswrapper[4749]: E0320 08:01:11.177798 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 08:01:11 crc kubenswrapper[4749]: I0320 08:01:11.440334 4749 generic.go:334] "Generic (PLEG): container finished" podID="7c06efc2-f581-474c-bd22-1b8f6d4625fa" containerID="26cf4f009f40f8b1b7a7309902727aef9303fe82a1eb71586b197f826591dd4d" exitCode=0 Mar 20 08:01:11 crc kubenswrapper[4749]: I0320 08:01:11.440379 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bbxzx" event={"ID":"7c06efc2-f581-474c-bd22-1b8f6d4625fa","Type":"ContainerDied","Data":"26cf4f009f40f8b1b7a7309902727aef9303fe82a1eb71586b197f826591dd4d"} Mar 20 08:01:11 crc kubenswrapper[4749]: I0320 08:01:11.440418 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bbxzx" Mar 20 08:01:11 crc kubenswrapper[4749]: I0320 08:01:11.440444 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bbxzx" event={"ID":"7c06efc2-f581-474c-bd22-1b8f6d4625fa","Type":"ContainerDied","Data":"23e46b614b0df364b2c0c244549266eedaf884da88821d79a0766ff49bbb75f5"} Mar 20 08:01:11 crc kubenswrapper[4749]: I0320 08:01:11.440471 4749 scope.go:117] "RemoveContainer" containerID="26cf4f009f40f8b1b7a7309902727aef9303fe82a1eb71586b197f826591dd4d" Mar 20 08:01:11 crc kubenswrapper[4749]: I0320 08:01:11.461219 4749 scope.go:117] "RemoveContainer" containerID="c9010f7f050f3eefbf83af0959edfe3e33ea1f217f4dbb230f034467ecf85b43" Mar 20 08:01:11 crc kubenswrapper[4749]: I0320 08:01:11.478175 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bbxzx"] Mar 20 08:01:11 crc kubenswrapper[4749]: I0320 08:01:11.501958 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bbxzx"] Mar 20 08:01:11 crc kubenswrapper[4749]: I0320 08:01:11.502079 4749 scope.go:117] "RemoveContainer" containerID="5daf76e3735d3c9bd53bbc2fb1bff963e7ffdbfd1d30886442b04e4c6dd4cda5" Mar 20 08:01:11 crc kubenswrapper[4749]: I0320 08:01:11.541619 4749 scope.go:117] "RemoveContainer" containerID="26cf4f009f40f8b1b7a7309902727aef9303fe82a1eb71586b197f826591dd4d" Mar 20 08:01:11 crc kubenswrapper[4749]: E0320 08:01:11.542107 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26cf4f009f40f8b1b7a7309902727aef9303fe82a1eb71586b197f826591dd4d\": container with ID starting with 26cf4f009f40f8b1b7a7309902727aef9303fe82a1eb71586b197f826591dd4d not found: ID does not exist" containerID="26cf4f009f40f8b1b7a7309902727aef9303fe82a1eb71586b197f826591dd4d" Mar 20 08:01:11 crc kubenswrapper[4749]: I0320 08:01:11.542167 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26cf4f009f40f8b1b7a7309902727aef9303fe82a1eb71586b197f826591dd4d"} err="failed to get container status \"26cf4f009f40f8b1b7a7309902727aef9303fe82a1eb71586b197f826591dd4d\": rpc error: code = NotFound desc = could not find container \"26cf4f009f40f8b1b7a7309902727aef9303fe82a1eb71586b197f826591dd4d\": container with ID starting with 26cf4f009f40f8b1b7a7309902727aef9303fe82a1eb71586b197f826591dd4d not found: ID does not exist" Mar 20 08:01:11 crc kubenswrapper[4749]: I0320 08:01:11.542199 4749 scope.go:117] "RemoveContainer" containerID="c9010f7f050f3eefbf83af0959edfe3e33ea1f217f4dbb230f034467ecf85b43" Mar 20 08:01:11 crc kubenswrapper[4749]: E0320 08:01:11.542633 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9010f7f050f3eefbf83af0959edfe3e33ea1f217f4dbb230f034467ecf85b43\": container with ID starting with c9010f7f050f3eefbf83af0959edfe3e33ea1f217f4dbb230f034467ecf85b43 not found: ID does not exist" containerID="c9010f7f050f3eefbf83af0959edfe3e33ea1f217f4dbb230f034467ecf85b43" Mar 20 08:01:11 crc kubenswrapper[4749]: I0320 08:01:11.542681 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9010f7f050f3eefbf83af0959edfe3e33ea1f217f4dbb230f034467ecf85b43"} err="failed to get container status \"c9010f7f050f3eefbf83af0959edfe3e33ea1f217f4dbb230f034467ecf85b43\": rpc error: code = NotFound desc = could not find 
container \"c9010f7f050f3eefbf83af0959edfe3e33ea1f217f4dbb230f034467ecf85b43\": container with ID starting with c9010f7f050f3eefbf83af0959edfe3e33ea1f217f4dbb230f034467ecf85b43 not found: ID does not exist" Mar 20 08:01:11 crc kubenswrapper[4749]: I0320 08:01:11.542706 4749 scope.go:117] "RemoveContainer" containerID="5daf76e3735d3c9bd53bbc2fb1bff963e7ffdbfd1d30886442b04e4c6dd4cda5" Mar 20 08:01:11 crc kubenswrapper[4749]: E0320 08:01:11.542936 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5daf76e3735d3c9bd53bbc2fb1bff963e7ffdbfd1d30886442b04e4c6dd4cda5\": container with ID starting with 5daf76e3735d3c9bd53bbc2fb1bff963e7ffdbfd1d30886442b04e4c6dd4cda5 not found: ID does not exist" containerID="5daf76e3735d3c9bd53bbc2fb1bff963e7ffdbfd1d30886442b04e4c6dd4cda5" Mar 20 08:01:11 crc kubenswrapper[4749]: I0320 08:01:11.542963 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5daf76e3735d3c9bd53bbc2fb1bff963e7ffdbfd1d30886442b04e4c6dd4cda5"} err="failed to get container status \"5daf76e3735d3c9bd53bbc2fb1bff963e7ffdbfd1d30886442b04e4c6dd4cda5\": rpc error: code = NotFound desc = could not find container \"5daf76e3735d3c9bd53bbc2fb1bff963e7ffdbfd1d30886442b04e4c6dd4cda5\": container with ID starting with 5daf76e3735d3c9bd53bbc2fb1bff963e7ffdbfd1d30886442b04e4c6dd4cda5 not found: ID does not exist" Mar 20 08:01:12 crc kubenswrapper[4749]: I0320 08:01:12.188896 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c06efc2-f581-474c-bd22-1b8f6d4625fa" path="/var/lib/kubelet/pods/7c06efc2-f581-474c-bd22-1b8f6d4625fa/volumes" Mar 20 08:01:13 crc kubenswrapper[4749]: I0320 08:01:13.177006 4749 scope.go:117] "RemoveContainer" containerID="cc4625b9209797aa379352fd9f58107f7e42d2b4f479b4f5c1a0f1d6334d65a3" Mar 20 08:01:13 crc kubenswrapper[4749]: E0320 08:01:13.177355 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 08:01:23 crc kubenswrapper[4749]: I0320 08:01:23.176964 4749 scope.go:117] "RemoveContainer" containerID="ae3deabce868852ca8b2c75b310320888cefe3d1f8956d9347d0fcd6a7d22838" Mar 20 08:01:23 crc kubenswrapper[4749]: E0320 08:01:23.178144 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 08:01:25 crc kubenswrapper[4749]: I0320 08:01:25.178194 4749 scope.go:117] "RemoveContainer" containerID="cc4625b9209797aa379352fd9f58107f7e42d2b4f479b4f5c1a0f1d6334d65a3" Mar 20 08:01:25 crc kubenswrapper[4749]: E0320 08:01:25.178913 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 08:01:34 crc kubenswrapper[4749]: I0320 08:01:34.515131 4749 patch_prober.go:28] interesting 
pod/machine-config-daemon-fxqfd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:01:34 crc kubenswrapper[4749]: I0320 08:01:34.516082 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:01:34 crc kubenswrapper[4749]: I0320 08:01:34.516160 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" Mar 20 08:01:34 crc kubenswrapper[4749]: I0320 08:01:34.517369 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ea0896570df2e5e44ead081d77eb8ae850d278e89ed0c72cc980fc1dfdd27382"} pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 08:01:34 crc kubenswrapper[4749]: I0320 08:01:34.517550 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerName="machine-config-daemon" containerID="cri-o://ea0896570df2e5e44ead081d77eb8ae850d278e89ed0c72cc980fc1dfdd27382" gracePeriod=600 Mar 20 08:01:34 crc kubenswrapper[4749]: E0320 08:01:34.650389 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 08:01:34 crc kubenswrapper[4749]: I0320 08:01:34.673984 4749 generic.go:334] "Generic (PLEG): container finished" podID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerID="ea0896570df2e5e44ead081d77eb8ae850d278e89ed0c72cc980fc1dfdd27382" exitCode=0 Mar 20 08:01:34 crc kubenswrapper[4749]: I0320 08:01:34.674031 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" event={"ID":"12151228-1cb9-4086-9a62-f4a9583f5f69","Type":"ContainerDied","Data":"ea0896570df2e5e44ead081d77eb8ae850d278e89ed0c72cc980fc1dfdd27382"} Mar 20 08:01:34 crc kubenswrapper[4749]: I0320 08:01:34.674065 4749 scope.go:117] "RemoveContainer" containerID="5b4585b69865061d7b06cc9fea1a7b04c408f0d820173decb8cf2fe0fabf3eda" Mar 20 08:01:34 crc kubenswrapper[4749]: I0320 08:01:34.675010 4749 scope.go:117] "RemoveContainer" containerID="ea0896570df2e5e44ead081d77eb8ae850d278e89ed0c72cc980fc1dfdd27382" Mar 20 08:01:34 crc kubenswrapper[4749]: E0320 08:01:34.676200 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 08:01:35 crc kubenswrapper[4749]: I0320 08:01:35.177679 4749 scope.go:117] "RemoveContainer" containerID="ae3deabce868852ca8b2c75b310320888cefe3d1f8956d9347d0fcd6a7d22838" Mar 20 08:01:35 crc kubenswrapper[4749]: E0320 08:01:35.177909 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 08:01:39 crc kubenswrapper[4749]: I0320 08:01:39.177432 4749 scope.go:117] "RemoveContainer" containerID="cc4625b9209797aa379352fd9f58107f7e42d2b4f479b4f5c1a0f1d6334d65a3" Mar 20 08:01:39 crc kubenswrapper[4749]: E0320 08:01:39.177940 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 08:01:47 crc kubenswrapper[4749]: I0320 08:01:47.177445 4749 scope.go:117] "RemoveContainer" containerID="ea0896570df2e5e44ead081d77eb8ae850d278e89ed0c72cc980fc1dfdd27382" Mar 20 08:01:47 crc kubenswrapper[4749]: E0320 08:01:47.178500 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 08:01:49 crc kubenswrapper[4749]: I0320 08:01:49.177369 4749 scope.go:117] "RemoveContainer" containerID="ae3deabce868852ca8b2c75b310320888cefe3d1f8956d9347d0fcd6a7d22838" Mar 20 08:01:49 crc kubenswrapper[4749]: E0320 08:01:49.178189 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 08:01:51 crc kubenswrapper[4749]: I0320 08:01:51.177431 4749 scope.go:117] "RemoveContainer" containerID="cc4625b9209797aa379352fd9f58107f7e42d2b4f479b4f5c1a0f1d6334d65a3" Mar 20 08:01:51 crc kubenswrapper[4749]: E0320 08:01:51.177891 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 08:02:00 crc kubenswrapper[4749]: I0320 08:02:00.172377 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566562-gnlpf"] Mar 20 08:02:00 crc kubenswrapper[4749]: E0320 08:02:00.174156 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c06efc2-f581-474c-bd22-1b8f6d4625fa" containerName="extract-utilities" Mar 20 08:02:00 crc kubenswrapper[4749]: I0320 
08:02:00.174192 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c06efc2-f581-474c-bd22-1b8f6d4625fa" containerName="extract-utilities" Mar 20 08:02:00 crc kubenswrapper[4749]: E0320 08:02:00.174234 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c06efc2-f581-474c-bd22-1b8f6d4625fa" containerName="extract-content" Mar 20 08:02:00 crc kubenswrapper[4749]: I0320 08:02:00.174251 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c06efc2-f581-474c-bd22-1b8f6d4625fa" containerName="extract-content" Mar 20 08:02:00 crc kubenswrapper[4749]: E0320 08:02:00.174328 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c06efc2-f581-474c-bd22-1b8f6d4625fa" containerName="registry-server" Mar 20 08:02:00 crc kubenswrapper[4749]: I0320 08:02:00.174351 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c06efc2-f581-474c-bd22-1b8f6d4625fa" containerName="registry-server" Mar 20 08:02:00 crc kubenswrapper[4749]: I0320 08:02:00.174807 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c06efc2-f581-474c-bd22-1b8f6d4625fa" containerName="registry-server" Mar 20 08:02:00 crc kubenswrapper[4749]: I0320 08:02:00.175935 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566562-gnlpf" Mar 20 08:02:00 crc kubenswrapper[4749]: I0320 08:02:00.178202 4749 scope.go:117] "RemoveContainer" containerID="ea0896570df2e5e44ead081d77eb8ae850d278e89ed0c72cc980fc1dfdd27382" Mar 20 08:02:00 crc kubenswrapper[4749]: E0320 08:02:00.178728 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 08:02:00 crc kubenswrapper[4749]: I0320 08:02:00.181224 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 08:02:00 crc kubenswrapper[4749]: I0320 08:02:00.181481 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 08:02:00 crc kubenswrapper[4749]: I0320 08:02:00.183968 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6rhdf" Mar 20 08:02:00 crc kubenswrapper[4749]: I0320 08:02:00.190603 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566562-gnlpf"] Mar 20 08:02:00 crc kubenswrapper[4749]: I0320 08:02:00.280599 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhfhd\" (UniqueName: \"kubernetes.io/projected/4f4d5f3a-39d8-403e-9dfc-6155204cadf0-kube-api-access-nhfhd\") pod \"auto-csr-approver-29566562-gnlpf\" (UID: \"4f4d5f3a-39d8-403e-9dfc-6155204cadf0\") " pod="openshift-infra/auto-csr-approver-29566562-gnlpf" Mar 20 08:02:00 crc kubenswrapper[4749]: I0320 08:02:00.382644 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhfhd\" (UniqueName: \"kubernetes.io/projected/4f4d5f3a-39d8-403e-9dfc-6155204cadf0-kube-api-access-nhfhd\") pod \"auto-csr-approver-29566562-gnlpf\" (UID: \"4f4d5f3a-39d8-403e-9dfc-6155204cadf0\") " 
pod="openshift-infra/auto-csr-approver-29566562-gnlpf" Mar 20 08:02:00 crc kubenswrapper[4749]: I0320 08:02:00.421459 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhfhd\" (UniqueName: \"kubernetes.io/projected/4f4d5f3a-39d8-403e-9dfc-6155204cadf0-kube-api-access-nhfhd\") pod \"auto-csr-approver-29566562-gnlpf\" (UID: \"4f4d5f3a-39d8-403e-9dfc-6155204cadf0\") " pod="openshift-infra/auto-csr-approver-29566562-gnlpf" Mar 20 08:02:00 crc kubenswrapper[4749]: I0320 08:02:00.516178 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566562-gnlpf" Mar 20 08:02:01 crc kubenswrapper[4749]: I0320 08:02:01.035394 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566562-gnlpf"] Mar 20 08:02:01 crc kubenswrapper[4749]: I0320 08:02:01.037115 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 08:02:01 crc kubenswrapper[4749]: I0320 08:02:01.970924 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566562-gnlpf" event={"ID":"4f4d5f3a-39d8-403e-9dfc-6155204cadf0","Type":"ContainerStarted","Data":"9c20962f91692da2c65d4c0f868ea5b7d7495285ed82c7a15cacb6e77daa9181"} Mar 20 08:02:02 crc kubenswrapper[4749]: I0320 08:02:02.985373 4749 generic.go:334] "Generic (PLEG): container finished" podID="4f4d5f3a-39d8-403e-9dfc-6155204cadf0" containerID="d91dfc559d5090b384a320f468e9284817d13acb128fdbcccfe3aee2cf549b97" exitCode=0 Mar 20 08:02:02 crc kubenswrapper[4749]: I0320 08:02:02.985487 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566562-gnlpf" event={"ID":"4f4d5f3a-39d8-403e-9dfc-6155204cadf0","Type":"ContainerDied","Data":"d91dfc559d5090b384a320f468e9284817d13acb128fdbcccfe3aee2cf549b97"} Mar 20 08:02:04 crc kubenswrapper[4749]: I0320 08:02:04.179409 4749 scope.go:117] "RemoveContainer" containerID="ae3deabce868852ca8b2c75b310320888cefe3d1f8956d9347d0fcd6a7d22838" Mar 20 08:02:04 crc kubenswrapper[4749]: E0320 08:02:04.179959 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 08:02:04 crc kubenswrapper[4749]: I0320 08:02:04.374766 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566562-gnlpf" Mar 20 08:02:04 crc kubenswrapper[4749]: I0320 08:02:04.463670 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhfhd\" (UniqueName: \"kubernetes.io/projected/4f4d5f3a-39d8-403e-9dfc-6155204cadf0-kube-api-access-nhfhd\") pod \"4f4d5f3a-39d8-403e-9dfc-6155204cadf0\" (UID: \"4f4d5f3a-39d8-403e-9dfc-6155204cadf0\") " Mar 20 08:02:04 crc kubenswrapper[4749]: I0320 08:02:04.470771 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f4d5f3a-39d8-403e-9dfc-6155204cadf0-kube-api-access-nhfhd" (OuterVolumeSpecName: "kube-api-access-nhfhd") pod "4f4d5f3a-39d8-403e-9dfc-6155204cadf0" (UID: "4f4d5f3a-39d8-403e-9dfc-6155204cadf0"). InnerVolumeSpecName "kube-api-access-nhfhd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:02:04 crc kubenswrapper[4749]: I0320 08:02:04.565519 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhfhd\" (UniqueName: \"kubernetes.io/projected/4f4d5f3a-39d8-403e-9dfc-6155204cadf0-kube-api-access-nhfhd\") on node \"crc\" DevicePath \"\"" Mar 20 08:02:05 crc kubenswrapper[4749]: I0320 08:02:05.005553 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566562-gnlpf" event={"ID":"4f4d5f3a-39d8-403e-9dfc-6155204cadf0","Type":"ContainerDied","Data":"9c20962f91692da2c65d4c0f868ea5b7d7495285ed82c7a15cacb6e77daa9181"} Mar 20 08:02:05 crc kubenswrapper[4749]: I0320 08:02:05.005589 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c20962f91692da2c65d4c0f868ea5b7d7495285ed82c7a15cacb6e77daa9181" Mar 20 08:02:05 crc kubenswrapper[4749]: I0320 08:02:05.005922 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566562-gnlpf" Mar 20 08:02:05 crc kubenswrapper[4749]: I0320 08:02:05.463436 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566556-42bpd"] Mar 20 08:02:05 crc kubenswrapper[4749]: I0320 08:02:05.475122 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566556-42bpd"] Mar 20 08:02:06 crc kubenswrapper[4749]: I0320 08:02:06.177617 4749 scope.go:117] "RemoveContainer" containerID="cc4625b9209797aa379352fd9f58107f7e42d2b4f479b4f5c1a0f1d6334d65a3" Mar 20 08:02:06 crc kubenswrapper[4749]: E0320 08:02:06.177979 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 08:02:06 crc kubenswrapper[4749]: I0320 08:02:06.191051 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e17fd93-fb61-4e2f-86c5-cfafb328c79c" path="/var/lib/kubelet/pods/8e17fd93-fb61-4e2f-86c5-cfafb328c79c/volumes" Mar 20 08:02:10 crc kubenswrapper[4749]: I0320 08:02:10.796049 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-h9vbt/must-gather-5qg26"] Mar 20 08:02:10 crc kubenswrapper[4749]: E0320 08:02:10.797068 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f4d5f3a-39d8-403e-9dfc-6155204cadf0" containerName="oc" Mar 20 08:02:10 crc kubenswrapper[4749]: I0320 08:02:10.797087 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f4d5f3a-39d8-403e-9dfc-6155204cadf0" containerName="oc" Mar 20 08:02:10 crc kubenswrapper[4749]: I0320 08:02:10.797523 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f4d5f3a-39d8-403e-9dfc-6155204cadf0" containerName="oc" Mar 20 08:02:10 crc kubenswrapper[4749]: I0320 08:02:10.800785 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-h9vbt/must-gather-5qg26" Mar 20 08:02:10 crc kubenswrapper[4749]: I0320 08:02:10.805007 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-h9vbt"/"kube-root-ca.crt" Mar 20 08:02:10 crc kubenswrapper[4749]: I0320 08:02:10.808686 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-h9vbt"/"openshift-service-ca.crt" Mar 20 08:02:10 crc kubenswrapper[4749]: I0320 08:02:10.817650 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-h9vbt/must-gather-5qg26"] Mar 20 08:02:10 crc kubenswrapper[4749]: I0320 08:02:10.999411 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br9vk\" (UniqueName: \"kubernetes.io/projected/15b16cd7-4e64-4fb8-a2e2-8fc7f9e23fbf-kube-api-access-br9vk\") pod \"must-gather-5qg26\" (UID: \"15b16cd7-4e64-4fb8-a2e2-8fc7f9e23fbf\") " pod="openshift-must-gather-h9vbt/must-gather-5qg26" Mar 20 08:02:10 crc kubenswrapper[4749]: I0320 08:02:10.999491 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/15b16cd7-4e64-4fb8-a2e2-8fc7f9e23fbf-must-gather-output\") pod \"must-gather-5qg26\" (UID: \"15b16cd7-4e64-4fb8-a2e2-8fc7f9e23fbf\") " pod="openshift-must-gather-h9vbt/must-gather-5qg26" Mar 20 08:02:11 crc kubenswrapper[4749]: I0320 08:02:11.101504 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-br9vk\" (UniqueName: \"kubernetes.io/projected/15b16cd7-4e64-4fb8-a2e2-8fc7f9e23fbf-kube-api-access-br9vk\") pod \"must-gather-5qg26\" (UID: \"15b16cd7-4e64-4fb8-a2e2-8fc7f9e23fbf\") " pod="openshift-must-gather-h9vbt/must-gather-5qg26" Mar 20 08:02:11 crc kubenswrapper[4749]: I0320 08:02:11.101586 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/15b16cd7-4e64-4fb8-a2e2-8fc7f9e23fbf-must-gather-output\") pod \"must-gather-5qg26\" (UID: \"15b16cd7-4e64-4fb8-a2e2-8fc7f9e23fbf\") " pod="openshift-must-gather-h9vbt/must-gather-5qg26" Mar 20 08:02:11 crc kubenswrapper[4749]: I0320 08:02:11.102340 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/15b16cd7-4e64-4fb8-a2e2-8fc7f9e23fbf-must-gather-output\") pod \"must-gather-5qg26\" (UID: \"15b16cd7-4e64-4fb8-a2e2-8fc7f9e23fbf\") " pod="openshift-must-gather-h9vbt/must-gather-5qg26" Mar 20 08:02:11 crc kubenswrapper[4749]: I0320 08:02:11.122010 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-br9vk\" (UniqueName: \"kubernetes.io/projected/15b16cd7-4e64-4fb8-a2e2-8fc7f9e23fbf-kube-api-access-br9vk\") pod \"must-gather-5qg26\" (UID: \"15b16cd7-4e64-4fb8-a2e2-8fc7f9e23fbf\") " pod="openshift-must-gather-h9vbt/must-gather-5qg26" Mar 20 08:02:11 crc kubenswrapper[4749]: I0320 08:02:11.420632 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-h9vbt/must-gather-5qg26" Mar 20 08:02:11 crc kubenswrapper[4749]: I0320 08:02:11.894478 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-h9vbt/must-gather-5qg26"] Mar 20 08:02:12 crc kubenswrapper[4749]: I0320 08:02:12.069354 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h9vbt/must-gather-5qg26" event={"ID":"15b16cd7-4e64-4fb8-a2e2-8fc7f9e23fbf","Type":"ContainerStarted","Data":"03e9e599ffcf1127b748f63ba43b277c87916bd5e57ab4901149cad7045770d4"} Mar 20 08:02:14 crc kubenswrapper[4749]: I0320 08:02:14.187802 4749 scope.go:117] "RemoveContainer" containerID="ea0896570df2e5e44ead081d77eb8ae850d278e89ed0c72cc980fc1dfdd27382" Mar 20 08:02:14 crc kubenswrapper[4749]: E0320 08:02:14.188309 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 08:02:17 crc kubenswrapper[4749]: I0320 08:02:17.177012 4749 scope.go:117] "RemoveContainer" containerID="cc4625b9209797aa379352fd9f58107f7e42d2b4f479b4f5c1a0f1d6334d65a3" Mar 20 08:02:17 crc kubenswrapper[4749]: E0320 08:02:17.177468 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 08:02:18 crc kubenswrapper[4749]: I0320 08:02:18.145573 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h9vbt/must-gather-5qg26" event={"ID":"15b16cd7-4e64-4fb8-a2e2-8fc7f9e23fbf","Type":"ContainerStarted","Data":"a2730979fad73f1bc2efb5785221f9f32dacbfbb44fbf44ddbad8d07d8d7de51"} Mar 20 08:02:18 crc kubenswrapper[4749]: I0320 08:02:18.864375 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-h9vbt/crc-debug-lt9lh"] Mar 20 08:02:18 crc kubenswrapper[4749]: I0320 08:02:18.866369 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-h9vbt/crc-debug-lt9lh" Mar 20 08:02:18 crc kubenswrapper[4749]: I0320 08:02:18.868935 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-h9vbt"/"default-dockercfg-mh5dc" Mar 20 08:02:18 crc kubenswrapper[4749]: I0320 08:02:18.954886 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df413785-7b91-47f6-9543-2caa859478e6-host\") pod \"crc-debug-lt9lh\" (UID: \"df413785-7b91-47f6-9543-2caa859478e6\") " pod="openshift-must-gather-h9vbt/crc-debug-lt9lh" Mar 20 08:02:18 crc kubenswrapper[4749]: I0320 08:02:18.954956 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-664kb\" (UniqueName: \"kubernetes.io/projected/df413785-7b91-47f6-9543-2caa859478e6-kube-api-access-664kb\") pod \"crc-debug-lt9lh\" (UID: \"df413785-7b91-47f6-9543-2caa859478e6\") " pod="openshift-must-gather-h9vbt/crc-debug-lt9lh" Mar 20 08:02:19 crc kubenswrapper[4749]: I0320 08:02:19.056360 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df413785-7b91-47f6-9543-2caa859478e6-host\") pod \"crc-debug-lt9lh\" (UID: \"df413785-7b91-47f6-9543-2caa859478e6\") " pod="openshift-must-gather-h9vbt/crc-debug-lt9lh" Mar 20 08:02:19 crc kubenswrapper[4749]: I0320 08:02:19.056712 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-664kb\" (UniqueName: \"kubernetes.io/projected/df413785-7b91-47f6-9543-2caa859478e6-kube-api-access-664kb\") pod \"crc-debug-lt9lh\" (UID: \"df413785-7b91-47f6-9543-2caa859478e6\") " pod="openshift-must-gather-h9vbt/crc-debug-lt9lh" Mar 20 08:02:19 crc kubenswrapper[4749]: I0320 08:02:19.056523 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df413785-7b91-47f6-9543-2caa859478e6-host\") pod \"crc-debug-lt9lh\" (UID: \"df413785-7b91-47f6-9543-2caa859478e6\") " pod="openshift-must-gather-h9vbt/crc-debug-lt9lh" Mar 20 08:02:19 crc kubenswrapper[4749]: I0320 08:02:19.082193 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-664kb\" (UniqueName: \"kubernetes.io/projected/df413785-7b91-47f6-9543-2caa859478e6-kube-api-access-664kb\") pod \"crc-debug-lt9lh\" (UID: \"df413785-7b91-47f6-9543-2caa859478e6\") " pod="openshift-must-gather-h9vbt/crc-debug-lt9lh" Mar 20 08:02:19 crc kubenswrapper[4749]: I0320 08:02:19.153901 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h9vbt/must-gather-5qg26" event={"ID":"15b16cd7-4e64-4fb8-a2e2-8fc7f9e23fbf","Type":"ContainerStarted","Data":"e6158cb2d59044f0ab0a1c77b1970922ce4ef675a575bc19c8cbbbbccf00ce8b"} Mar 20 08:02:19 crc kubenswrapper[4749]: I0320 08:02:19.175587 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-h9vbt/must-gather-5qg26" podStartSLOduration=3.297500559 podStartE2EDuration="9.175566679s" podCreationTimestamp="2026-03-20 08:02:10 +0000 UTC" firstStartedPulling="2026-03-20 08:02:11.88403439 +0000 UTC m=+2968.433692047" lastFinishedPulling="2026-03-20 08:02:17.76210052 +0000 UTC m=+2974.311758167" observedRunningTime="2026-03-20 08:02:19.171740627 +0000 UTC m=+2975.721398274" watchObservedRunningTime="2026-03-20 08:02:19.175566679 +0000 UTC m=+2975.725224336" Mar 20 08:02:19 crc kubenswrapper[4749]: I0320 
08:02:19.177462 4749 scope.go:117] "RemoveContainer" containerID="ae3deabce868852ca8b2c75b310320888cefe3d1f8956d9347d0fcd6a7d22838" Mar 20 08:02:19 crc kubenswrapper[4749]: E0320 08:02:19.177852 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 08:02:19 crc kubenswrapper[4749]: I0320 08:02:19.181709 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h9vbt/crc-debug-lt9lh" Mar 20 08:02:19 crc kubenswrapper[4749]: W0320 08:02:19.207639 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf413785_7b91_47f6_9543_2caa859478e6.slice/crio-23b1b2a23114980c184930679bba0794829058f0b844e38f138519f5807f2e45 WatchSource:0}: Error finding container 23b1b2a23114980c184930679bba0794829058f0b844e38f138519f5807f2e45: Status 404 returned error can't find the container with id 23b1b2a23114980c184930679bba0794829058f0b844e38f138519f5807f2e45 Mar 20 08:02:20 crc kubenswrapper[4749]: I0320 08:02:20.162538 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h9vbt/crc-debug-lt9lh" event={"ID":"df413785-7b91-47f6-9543-2caa859478e6","Type":"ContainerStarted","Data":"23b1b2a23114980c184930679bba0794829058f0b844e38f138519f5807f2e45"} Mar 20 08:02:25 crc kubenswrapper[4749]: I0320 08:02:25.177312 4749 scope.go:117] "RemoveContainer" containerID="ea0896570df2e5e44ead081d77eb8ae850d278e89ed0c72cc980fc1dfdd27382" Mar 20 08:02:25 crc kubenswrapper[4749]: E0320 08:02:25.178388 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 08:02:31 crc kubenswrapper[4749]: I0320 08:02:31.177392 4749 scope.go:117] "RemoveContainer" containerID="ae3deabce868852ca8b2c75b310320888cefe3d1f8956d9347d0fcd6a7d22838" Mar 20 08:02:31 crc kubenswrapper[4749]: E0320 08:02:31.178042 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 08:02:31 crc kubenswrapper[4749]: I0320 08:02:31.235454 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h9vbt/crc-debug-lt9lh" event={"ID":"df413785-7b91-47f6-9543-2caa859478e6","Type":"ContainerStarted","Data":"e2ac2d8da547e6d715111e827e7262be5e861add8fd2e0469d06f384e768d5b9"} Mar 20 08:02:31 crc kubenswrapper[4749]: I0320 08:02:31.259908 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-h9vbt/crc-debug-lt9lh" podStartSLOduration=2.415823373 podStartE2EDuration="13.259882659s" podCreationTimestamp="2026-03-20 08:02:18 +0000 UTC" firstStartedPulling="2026-03-20 08:02:19.210519757 +0000 UTC m=+2975.760177444" 
lastFinishedPulling="2026-03-20 08:02:30.054579083 +0000 UTC m=+2986.604236730" observedRunningTime="2026-03-20 08:02:31.249020428 +0000 UTC m=+2987.798678085" watchObservedRunningTime="2026-03-20 08:02:31.259882659 +0000 UTC m=+2987.809540346" Mar 20 08:02:32 crc kubenswrapper[4749]: I0320 08:02:32.177137 4749 scope.go:117] "RemoveContainer" containerID="cc4625b9209797aa379352fd9f58107f7e42d2b4f479b4f5c1a0f1d6334d65a3" Mar 20 08:02:32 crc kubenswrapper[4749]: E0320 08:02:32.177777 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 08:02:37 crc kubenswrapper[4749]: I0320 08:02:37.177977 4749 scope.go:117] "RemoveContainer" containerID="ea0896570df2e5e44ead081d77eb8ae850d278e89ed0c72cc980fc1dfdd27382" Mar 20 08:02:37 crc kubenswrapper[4749]: E0320 08:02:37.178913 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 08:02:43 crc kubenswrapper[4749]: I0320 08:02:43.177559 4749 scope.go:117] "RemoveContainer" containerID="ae3deabce868852ca8b2c75b310320888cefe3d1f8956d9347d0fcd6a7d22838" Mar 20 08:02:43 crc kubenswrapper[4749]: E0320 08:02:43.178422 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 08:02:47 crc kubenswrapper[4749]: I0320 08:02:47.178123 4749 scope.go:117] "RemoveContainer" containerID="cc4625b9209797aa379352fd9f58107f7e42d2b4f479b4f5c1a0f1d6334d65a3" Mar 20 08:02:47 crc kubenswrapper[4749]: E0320 08:02:47.179096 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 08:02:48 crc kubenswrapper[4749]: I0320 08:02:48.378812 4749 generic.go:334] "Generic (PLEG): container finished" podID="df413785-7b91-47f6-9543-2caa859478e6" containerID="e2ac2d8da547e6d715111e827e7262be5e861add8fd2e0469d06f384e768d5b9" exitCode=0 Mar 20 08:02:48 crc kubenswrapper[4749]: I0320 08:02:48.378889 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h9vbt/crc-debug-lt9lh" event={"ID":"df413785-7b91-47f6-9543-2caa859478e6","Type":"ContainerDied","Data":"e2ac2d8da547e6d715111e827e7262be5e861add8fd2e0469d06f384e768d5b9"} Mar 20 08:02:49 crc kubenswrapper[4749]: I0320 08:02:49.177698 4749 scope.go:117] "RemoveContainer" containerID="ea0896570df2e5e44ead081d77eb8ae850d278e89ed0c72cc980fc1dfdd27382" Mar 20 08:02:49 crc kubenswrapper[4749]: E0320 08:02:49.178587 4749 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 08:02:49 crc kubenswrapper[4749]: I0320 08:02:49.489789 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h9vbt/crc-debug-lt9lh" Mar 20 08:02:49 crc kubenswrapper[4749]: I0320 08:02:49.528650 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-h9vbt/crc-debug-lt9lh"] Mar 20 08:02:49 crc kubenswrapper[4749]: I0320 08:02:49.536706 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-h9vbt/crc-debug-lt9lh"] Mar 20 08:02:49 crc kubenswrapper[4749]: I0320 08:02:49.555009 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-664kb\" (UniqueName: \"kubernetes.io/projected/df413785-7b91-47f6-9543-2caa859478e6-kube-api-access-664kb\") pod \"df413785-7b91-47f6-9543-2caa859478e6\" (UID: \"df413785-7b91-47f6-9543-2caa859478e6\") " Mar 20 08:02:49 crc kubenswrapper[4749]: I0320 08:02:49.555225 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df413785-7b91-47f6-9543-2caa859478e6-host\") pod \"df413785-7b91-47f6-9543-2caa859478e6\" (UID: \"df413785-7b91-47f6-9543-2caa859478e6\") " Mar 20 08:02:49 crc kubenswrapper[4749]: I0320 08:02:49.555357 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df413785-7b91-47f6-9543-2caa859478e6-host" (OuterVolumeSpecName: "host") pod "df413785-7b91-47f6-9543-2caa859478e6" (UID: "df413785-7b91-47f6-9543-2caa859478e6"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:02:49 crc kubenswrapper[4749]: I0320 08:02:49.555678 4749 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df413785-7b91-47f6-9543-2caa859478e6-host\") on node \"crc\" DevicePath \"\"" Mar 20 08:02:49 crc kubenswrapper[4749]: I0320 08:02:49.562679 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df413785-7b91-47f6-9543-2caa859478e6-kube-api-access-664kb" (OuterVolumeSpecName: "kube-api-access-664kb") pod "df413785-7b91-47f6-9543-2caa859478e6" (UID: "df413785-7b91-47f6-9543-2caa859478e6"). InnerVolumeSpecName "kube-api-access-664kb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:02:49 crc kubenswrapper[4749]: I0320 08:02:49.657572 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-664kb\" (UniqueName: \"kubernetes.io/projected/df413785-7b91-47f6-9543-2caa859478e6-kube-api-access-664kb\") on node \"crc\" DevicePath \"\"" Mar 20 08:02:50 crc kubenswrapper[4749]: I0320 08:02:50.190760 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df413785-7b91-47f6-9543-2caa859478e6" path="/var/lib/kubelet/pods/df413785-7b91-47f6-9543-2caa859478e6/volumes" Mar 20 08:02:50 crc kubenswrapper[4749]: I0320 08:02:50.397710 4749 scope.go:117] "RemoveContainer" containerID="e2ac2d8da547e6d715111e827e7262be5e861add8fd2e0469d06f384e768d5b9" Mar 20 08:02:50 crc kubenswrapper[4749]: I0320 08:02:50.397904 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h9vbt/crc-debug-lt9lh" Mar 20 08:02:50 crc kubenswrapper[4749]: I0320 08:02:50.735808 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-h9vbt/crc-debug-z8whn"] Mar 20 08:02:50 crc kubenswrapper[4749]: E0320 08:02:50.736394 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df413785-7b91-47f6-9543-2caa859478e6" containerName="container-00" Mar 20 08:02:50 crc kubenswrapper[4749]: I0320 08:02:50.736866 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="df413785-7b91-47f6-9543-2caa859478e6" containerName="container-00" Mar 20 08:02:50 crc kubenswrapper[4749]: I0320 08:02:50.737145 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="df413785-7b91-47f6-9543-2caa859478e6" containerName="container-00" Mar 20 08:02:50 crc kubenswrapper[4749]: I0320 08:02:50.738072 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-h9vbt/crc-debug-z8whn" Mar 20 08:02:50 crc kubenswrapper[4749]: I0320 08:02:50.739789 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-h9vbt"/"default-dockercfg-mh5dc" Mar 20 08:02:50 crc kubenswrapper[4749]: I0320 08:02:50.880479 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0fd7b424-f9e0-4da9-bcbb-5b522a6e5c6d-host\") pod \"crc-debug-z8whn\" (UID: \"0fd7b424-f9e0-4da9-bcbb-5b522a6e5c6d\") " pod="openshift-must-gather-h9vbt/crc-debug-z8whn" Mar 20 08:02:50 crc kubenswrapper[4749]: I0320 08:02:50.880979 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj9t5\" (UniqueName: \"kubernetes.io/projected/0fd7b424-f9e0-4da9-bcbb-5b522a6e5c6d-kube-api-access-mj9t5\") pod \"crc-debug-z8whn\" (UID: \"0fd7b424-f9e0-4da9-bcbb-5b522a6e5c6d\") " pod="openshift-must-gather-h9vbt/crc-debug-z8whn" Mar 20 08:02:50 crc kubenswrapper[4749]: I0320 08:02:50.982896 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj9t5\" (UniqueName: \"kubernetes.io/projected/0fd7b424-f9e0-4da9-bcbb-5b522a6e5c6d-kube-api-access-mj9t5\") pod \"crc-debug-z8whn\" (UID: \"0fd7b424-f9e0-4da9-bcbb-5b522a6e5c6d\") " pod="openshift-must-gather-h9vbt/crc-debug-z8whn" Mar 20 08:02:50 crc kubenswrapper[4749]: I0320 08:02:50.983025 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0fd7b424-f9e0-4da9-bcbb-5b522a6e5c6d-host\") pod \"crc-debug-z8whn\" (UID: \"0fd7b424-f9e0-4da9-bcbb-5b522a6e5c6d\") " pod="openshift-must-gather-h9vbt/crc-debug-z8whn" Mar 20 08:02:50 crc kubenswrapper[4749]: I0320 08:02:50.983144 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0fd7b424-f9e0-4da9-bcbb-5b522a6e5c6d-host\") pod \"crc-debug-z8whn\" (UID: \"0fd7b424-f9e0-4da9-bcbb-5b522a6e5c6d\") " pod="openshift-must-gather-h9vbt/crc-debug-z8whn" Mar 20 08:02:51 crc kubenswrapper[4749]: I0320 08:02:51.030346 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj9t5\" (UniqueName: \"kubernetes.io/projected/0fd7b424-f9e0-4da9-bcbb-5b522a6e5c6d-kube-api-access-mj9t5\") pod \"crc-debug-z8whn\" (UID: \"0fd7b424-f9e0-4da9-bcbb-5b522a6e5c6d\") " pod="openshift-must-gather-h9vbt/crc-debug-z8whn" Mar 20 08:02:51 crc kubenswrapper[4749]: I0320 08:02:51.062192 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-h9vbt/crc-debug-z8whn" Mar 20 08:02:51 crc kubenswrapper[4749]: W0320 08:02:51.098656 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0fd7b424_f9e0_4da9_bcbb_5b522a6e5c6d.slice/crio-7ecb2bde7d11b7db0a7134e34352067effe2cbf0e66d8f0b226baf8e5d98c934 WatchSource:0}: Error finding container 7ecb2bde7d11b7db0a7134e34352067effe2cbf0e66d8f0b226baf8e5d98c934: Status 404 returned error can't find the container with id 7ecb2bde7d11b7db0a7134e34352067effe2cbf0e66d8f0b226baf8e5d98c934 Mar 20 08:02:51 crc kubenswrapper[4749]: I0320 08:02:51.410179 4749 generic.go:334] "Generic (PLEG): container finished" podID="0fd7b424-f9e0-4da9-bcbb-5b522a6e5c6d" containerID="33d8c7bcc5b1d378048891d020353a77c71d931869cb70769d1f3d611540df1c" exitCode=1 Mar 20 08:02:51 crc kubenswrapper[4749]: I0320 08:02:51.410220 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h9vbt/crc-debug-z8whn" event={"ID":"0fd7b424-f9e0-4da9-bcbb-5b522a6e5c6d","Type":"ContainerDied","Data":"33d8c7bcc5b1d378048891d020353a77c71d931869cb70769d1f3d611540df1c"} Mar 20 08:02:51 crc kubenswrapper[4749]: I0320 08:02:51.410550 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h9vbt/crc-debug-z8whn" event={"ID":"0fd7b424-f9e0-4da9-bcbb-5b522a6e5c6d","Type":"ContainerStarted","Data":"7ecb2bde7d11b7db0a7134e34352067effe2cbf0e66d8f0b226baf8e5d98c934"} Mar 20 08:02:51 crc kubenswrapper[4749]: I0320 08:02:51.446572 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-h9vbt/crc-debug-z8whn"] Mar 20 08:02:51 crc kubenswrapper[4749]: I0320 08:02:51.453833 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-h9vbt/crc-debug-z8whn"] Mar 20 08:02:52 crc kubenswrapper[4749]: I0320 08:02:52.503789 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h9vbt/crc-debug-z8whn" Mar 20 08:02:52 crc kubenswrapper[4749]: I0320 08:02:52.609917 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mj9t5\" (UniqueName: \"kubernetes.io/projected/0fd7b424-f9e0-4da9-bcbb-5b522a6e5c6d-kube-api-access-mj9t5\") pod \"0fd7b424-f9e0-4da9-bcbb-5b522a6e5c6d\" (UID: \"0fd7b424-f9e0-4da9-bcbb-5b522a6e5c6d\") " Mar 20 08:02:52 crc kubenswrapper[4749]: I0320 08:02:52.609968 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0fd7b424-f9e0-4da9-bcbb-5b522a6e5c6d-host\") pod \"0fd7b424-f9e0-4da9-bcbb-5b522a6e5c6d\" (UID: \"0fd7b424-f9e0-4da9-bcbb-5b522a6e5c6d\") " Mar 20 08:02:52 crc kubenswrapper[4749]: I0320 08:02:52.610111 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0fd7b424-f9e0-4da9-bcbb-5b522a6e5c6d-host" (OuterVolumeSpecName: "host") pod "0fd7b424-f9e0-4da9-bcbb-5b522a6e5c6d" (UID: "0fd7b424-f9e0-4da9-bcbb-5b522a6e5c6d"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:02:52 crc kubenswrapper[4749]: I0320 08:02:52.610523 4749 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0fd7b424-f9e0-4da9-bcbb-5b522a6e5c6d-host\") on node \"crc\" DevicePath \"\"" Mar 20 08:02:52 crc kubenswrapper[4749]: I0320 08:02:52.619269 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fd7b424-f9e0-4da9-bcbb-5b522a6e5c6d-kube-api-access-mj9t5" (OuterVolumeSpecName: "kube-api-access-mj9t5") pod "0fd7b424-f9e0-4da9-bcbb-5b522a6e5c6d" (UID: "0fd7b424-f9e0-4da9-bcbb-5b522a6e5c6d"). InnerVolumeSpecName "kube-api-access-mj9t5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:02:52 crc kubenswrapper[4749]: I0320 08:02:52.712071 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mj9t5\" (UniqueName: \"kubernetes.io/projected/0fd7b424-f9e0-4da9-bcbb-5b522a6e5c6d-kube-api-access-mj9t5\") on node \"crc\" DevicePath \"\"" Mar 20 08:02:53 crc kubenswrapper[4749]: I0320 08:02:53.433037 4749 scope.go:117] "RemoveContainer" containerID="33d8c7bcc5b1d378048891d020353a77c71d931869cb70769d1f3d611540df1c" Mar 20 08:02:53 crc kubenswrapper[4749]: I0320 08:02:53.433065 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h9vbt/crc-debug-z8whn" Mar 20 08:02:54 crc kubenswrapper[4749]: I0320 08:02:54.186294 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fd7b424-f9e0-4da9-bcbb-5b522a6e5c6d" path="/var/lib/kubelet/pods/0fd7b424-f9e0-4da9-bcbb-5b522a6e5c6d/volumes" Mar 20 08:02:56 crc kubenswrapper[4749]: I0320 08:02:56.182711 4749 scope.go:117] "RemoveContainer" containerID="ae3deabce868852ca8b2c75b310320888cefe3d1f8956d9347d0fcd6a7d22838" Mar 20 08:02:56 crc kubenswrapper[4749]: E0320 08:02:56.183520 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 08:02:56 crc kubenswrapper[4749]: I0320 08:02:56.788925 4749 scope.go:117] "RemoveContainer" containerID="9704d59f05eae35673fe373c8ee22f03b0c4a5aaed0685139fe32f447db41b06" Mar 20 08:03:00 crc kubenswrapper[4749]: I0320 08:03:00.179168 4749 scope.go:117] "RemoveContainer" containerID="ea0896570df2e5e44ead081d77eb8ae850d278e89ed0c72cc980fc1dfdd27382" Mar 20 08:03:00 crc kubenswrapper[4749]: E0320 08:03:00.180174 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 08:03:01 crc kubenswrapper[4749]: I0320 08:03:01.177915 4749 scope.go:117] "RemoveContainer" containerID="cc4625b9209797aa379352fd9f58107f7e42d2b4f479b4f5c1a0f1d6334d65a3" Mar 20 08:03:01 crc kubenswrapper[4749]: E0320 08:03:01.178567 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq 
pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 08:03:07 crc kubenswrapper[4749]: I0320 08:03:07.177351 4749 scope.go:117] "RemoveContainer" containerID="ae3deabce868852ca8b2c75b310320888cefe3d1f8956d9347d0fcd6a7d22838" Mar 20 08:03:07 crc kubenswrapper[4749]: E0320 08:03:07.178119 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 08:03:11 crc kubenswrapper[4749]: I0320 08:03:11.215428 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-74f6bcbc87-l6jpr_98b88bbe-3668-4194-840d-1ba64dd6c32e/init/0.log" Mar 20 08:03:11 crc kubenswrapper[4749]: I0320 08:03:11.388852 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-74f6bcbc87-l6jpr_98b88bbe-3668-4194-840d-1ba64dd6c32e/init/0.log" Mar 20 08:03:11 crc kubenswrapper[4749]: I0320 08:03:11.412799 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-74f6bcbc87-l6jpr_98b88bbe-3668-4194-840d-1ba64dd6c32e/dnsmasq-dns/0.log" Mar 20 08:03:11 crc kubenswrapper[4749]: I0320 08:03:11.541574 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_b364d204-dcba-4b43-98e1-f1e22bd89b2c/kube-state-metrics/0.log" Mar 20 08:03:11 crc kubenswrapper[4749]: I0320 08:03:11.678157 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_e8fd1b2c-620b-44a6-b4f0-1c4d2cbda056/memcached/0.log" Mar 20 08:03:11 crc kubenswrapper[4749]: I0320 08:03:11.753103 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_8052bc33-6f6a-437e-9df5-508256f7e32f/mysql-bootstrap/0.log" Mar 20 08:03:11 crc kubenswrapper[4749]: I0320 08:03:11.879824 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_8052bc33-6f6a-437e-9df5-508256f7e32f/mysql-bootstrap/0.log" Mar 20 08:03:11 crc kubenswrapper[4749]: I0320 08:03:11.909690 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_8052bc33-6f6a-437e-9df5-508256f7e32f/galera/0.log" Mar 20 08:03:11 crc kubenswrapper[4749]: I0320 08:03:11.942445 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_1c96afef-fa85-45f2-89cd-2fb2db26b9f8/mysql-bootstrap/0.log" Mar 20 08:03:12 crc kubenswrapper[4749]: I0320 08:03:12.129495 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_1c96afef-fa85-45f2-89cd-2fb2db26b9f8/mysql-bootstrap/0.log" Mar 20 08:03:12 crc kubenswrapper[4749]: I0320 08:03:12.130266 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_1c96afef-fa85-45f2-89cd-2fb2db26b9f8/galera/0.log" Mar 20 08:03:12 crc kubenswrapper[4749]: I0320 08:03:12.145707 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-7xcsl_fe444947-6938-4000-8de5-462c8d0a42aa/openstack-network-exporter/0.log" Mar 20 08:03:12 crc kubenswrapper[4749]: I0320 08:03:12.178169 4749 scope.go:117] "RemoveContainer" containerID="cc4625b9209797aa379352fd9f58107f7e42d2b4f479b4f5c1a0f1d6334d65a3" Mar 20 08:03:12 crc 
kubenswrapper[4749]: E0320 08:03:12.178425 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 08:03:12 crc kubenswrapper[4749]: I0320 08:03:12.302637 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-kvqdd_d72e69d0-23f3-4d14-ab35-74ea19e79b69/ovsdb-server-init/0.log" Mar 20 08:03:12 crc kubenswrapper[4749]: I0320 08:03:12.490204 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-kvqdd_d72e69d0-23f3-4d14-ab35-74ea19e79b69/ovs-vswitchd/0.log" Mar 20 08:03:12 crc kubenswrapper[4749]: I0320 08:03:12.492777 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-kvqdd_d72e69d0-23f3-4d14-ab35-74ea19e79b69/ovsdb-server-init/0.log" Mar 20 08:03:12 crc kubenswrapper[4749]: I0320 08:03:12.495060 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-kvqdd_d72e69d0-23f3-4d14-ab35-74ea19e79b69/ovsdb-server/0.log" Mar 20 08:03:12 crc kubenswrapper[4749]: I0320 08:03:12.680702 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-tx9bw_32ceaa95-18d9-4f1e-9ebd-f2d413709413/ovn-controller/0.log" Mar 20 08:03:12 crc kubenswrapper[4749]: I0320 08:03:12.702764 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_b89f88dc-7614-4f05-ad26-dc1d46d10b85/openstack-network-exporter/0.log" Mar 20 08:03:12 crc kubenswrapper[4749]: I0320 08:03:12.743476 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_b89f88dc-7614-4f05-ad26-dc1d46d10b85/ovn-northd/0.log" Mar 20 08:03:12 crc kubenswrapper[4749]: I0320 08:03:12.889821 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_a3723f09-8ae3-44e2-b5c7-7824e62755f7/openstack-network-exporter/0.log" Mar 20 08:03:12 crc kubenswrapper[4749]: I0320 08:03:12.919399 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_a3723f09-8ae3-44e2-b5c7-7824e62755f7/ovsdbserver-nb/0.log" Mar 20 08:03:13 crc kubenswrapper[4749]: I0320 08:03:13.024619 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_541ae4e0-d6e7-4ad9-8451-8f5b840050de/openstack-network-exporter/0.log" Mar 20 08:03:13 crc kubenswrapper[4749]: I0320 08:03:13.061036 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_541ae4e0-d6e7-4ad9-8451-8f5b840050de/ovsdbserver-sb/0.log" Mar 20 08:03:13 crc kubenswrapper[4749]: I0320 08:03:13.140641 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_8b9b402f-2d95-48f5-98d8-497d90956ba2/setup-container/0.log" Mar 20 08:03:13 crc kubenswrapper[4749]: I0320 08:03:13.274855 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_8b9b402f-2d95-48f5-98d8-497d90956ba2/setup-container/0.log" Mar 20 08:03:13 crc kubenswrapper[4749]: I0320 08:03:13.283144 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_8b9b402f-2d95-48f5-98d8-497d90956ba2/rabbitmq/10.log" Mar 20 08:03:13 crc kubenswrapper[4749]: I0320 08:03:13.291756 4749 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_8b9b402f-2d95-48f5-98d8-497d90956ba2/rabbitmq/10.log" Mar 20 08:03:13 crc kubenswrapper[4749]: I0320 08:03:13.440515 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_8db06e36-0b00-4157-9345-69449da3e85f/setup-container/0.log" Mar 20 08:03:13 crc kubenswrapper[4749]: I0320 08:03:13.585213 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_8db06e36-0b00-4157-9345-69449da3e85f/setup-container/0.log" Mar 20 08:03:13 crc kubenswrapper[4749]: I0320 08:03:13.590263 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_8db06e36-0b00-4157-9345-69449da3e85f/rabbitmq/10.log" Mar 20 08:03:13 crc kubenswrapper[4749]: I0320 08:03:13.624627 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_8db06e36-0b00-4157-9345-69449da3e85f/rabbitmq/10.log" Mar 20 08:03:13 crc kubenswrapper[4749]: I0320 08:03:13.755228 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-dgbxq_3adcfcfa-0ea4-4c5e-9e57-957538c1469e/swift-ring-rebalance/0.log" Mar 20 08:03:13 crc kubenswrapper[4749]: I0320 08:03:13.819073 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8272956e-b31a-4bd8-9118-3ca9721e6d75/account-auditor/0.log" Mar 20 08:03:13 crc kubenswrapper[4749]: I0320 08:03:13.826845 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8272956e-b31a-4bd8-9118-3ca9721e6d75/account-reaper/0.log" Mar 20 08:03:13 crc kubenswrapper[4749]: I0320 08:03:13.958313 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8272956e-b31a-4bd8-9118-3ca9721e6d75/account-replicator/0.log" Mar 20 08:03:13 crc kubenswrapper[4749]: I0320 08:03:13.979227 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8272956e-b31a-4bd8-9118-3ca9721e6d75/account-server/0.log" Mar 20 08:03:14 crc kubenswrapper[4749]: I0320 08:03:14.001080 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8272956e-b31a-4bd8-9118-3ca9721e6d75/container-auditor/0.log" Mar 20 08:03:14 crc kubenswrapper[4749]: I0320 08:03:14.040466 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8272956e-b31a-4bd8-9118-3ca9721e6d75/container-replicator/0.log" Mar 20 08:03:14 crc kubenswrapper[4749]: I0320 08:03:14.117157 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8272956e-b31a-4bd8-9118-3ca9721e6d75/container-updater/0.log" Mar 20 08:03:14 crc kubenswrapper[4749]: I0320 08:03:14.149036 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8272956e-b31a-4bd8-9118-3ca9721e6d75/object-auditor/0.log" Mar 20 08:03:14 crc kubenswrapper[4749]: I0320 08:03:14.160168 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8272956e-b31a-4bd8-9118-3ca9721e6d75/container-server/0.log" Mar 20 08:03:14 crc kubenswrapper[4749]: I0320 08:03:14.205032 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8272956e-b31a-4bd8-9118-3ca9721e6d75/object-expirer/0.log" Mar 20 08:03:14 crc kubenswrapper[4749]: I0320 08:03:14.296594 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8272956e-b31a-4bd8-9118-3ca9721e6d75/object-replicator/0.log" Mar 20 
08:03:14 crc kubenswrapper[4749]: I0320 08:03:14.320795 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8272956e-b31a-4bd8-9118-3ca9721e6d75/object-server/0.log" Mar 20 08:03:14 crc kubenswrapper[4749]: I0320 08:03:14.343178 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8272956e-b31a-4bd8-9118-3ca9721e6d75/object-updater/0.log" Mar 20 08:03:14 crc kubenswrapper[4749]: I0320 08:03:14.415028 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8272956e-b31a-4bd8-9118-3ca9721e6d75/rsync/0.log" Mar 20 08:03:14 crc kubenswrapper[4749]: I0320 08:03:14.475537 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8272956e-b31a-4bd8-9118-3ca9721e6d75/swift-recon-cron/0.log" Mar 20 08:03:15 crc kubenswrapper[4749]: I0320 08:03:15.177679 4749 scope.go:117] "RemoveContainer" containerID="ea0896570df2e5e44ead081d77eb8ae850d278e89ed0c72cc980fc1dfdd27382" Mar 20 08:03:15 crc kubenswrapper[4749]: E0320 08:03:15.178111 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 08:03:21 crc kubenswrapper[4749]: I0320 08:03:21.178187 4749 scope.go:117] "RemoveContainer" containerID="ae3deabce868852ca8b2c75b310320888cefe3d1f8956d9347d0fcd6a7d22838" Mar 20 08:03:21 crc kubenswrapper[4749]: E0320 08:03:21.180859 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 08:03:26 crc kubenswrapper[4749]: I0320 08:03:26.178278 4749 scope.go:117] "RemoveContainer" containerID="ea0896570df2e5e44ead081d77eb8ae850d278e89ed0c72cc980fc1dfdd27382" Mar 20 08:03:26 crc kubenswrapper[4749]: E0320 08:03:26.179498 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 08:03:27 crc kubenswrapper[4749]: I0320 08:03:27.177841 4749 scope.go:117] "RemoveContainer" containerID="cc4625b9209797aa379352fd9f58107f7e42d2b4f479b4f5c1a0f1d6334d65a3" Mar 20 08:03:27 crc kubenswrapper[4749]: E0320 08:03:27.178105 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 08:03:33 crc kubenswrapper[4749]: I0320 08:03:33.178168 4749 scope.go:117] "RemoveContainer" containerID="ae3deabce868852ca8b2c75b310320888cefe3d1f8956d9347d0fcd6a7d22838" Mar 20 08:03:33 crc 
kubenswrapper[4749]: E0320 08:03:33.178978 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 08:03:33 crc kubenswrapper[4749]: I0320 08:03:33.191829 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-bbhfb_99111621-16af-4be2-b4d4-ce9b82e41165/manager/0.log" Mar 20 08:03:33 crc kubenswrapper[4749]: I0320 08:03:33.421377 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_be367e3118afbcc92c106aa60ad89a847599882d02c81ac0e6d7ee6cbcpg7z7_a1525cb6-0a04-499c-8737-81f1981815da/util/0.log" Mar 20 08:03:33 crc kubenswrapper[4749]: I0320 08:03:33.594226 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_be367e3118afbcc92c106aa60ad89a847599882d02c81ac0e6d7ee6cbcpg7z7_a1525cb6-0a04-499c-8737-81f1981815da/util/0.log" Mar 20 08:03:33 crc kubenswrapper[4749]: I0320 08:03:33.625777 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_be367e3118afbcc92c106aa60ad89a847599882d02c81ac0e6d7ee6cbcpg7z7_a1525cb6-0a04-499c-8737-81f1981815da/pull/0.log" Mar 20 08:03:33 crc kubenswrapper[4749]: I0320 08:03:33.649692 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_be367e3118afbcc92c106aa60ad89a847599882d02c81ac0e6d7ee6cbcpg7z7_a1525cb6-0a04-499c-8737-81f1981815da/pull/0.log" Mar 20 08:03:33 crc kubenswrapper[4749]: I0320 08:03:33.779078 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_be367e3118afbcc92c106aa60ad89a847599882d02c81ac0e6d7ee6cbcpg7z7_a1525cb6-0a04-499c-8737-81f1981815da/util/0.log" Mar 20 08:03:33 crc kubenswrapper[4749]: I0320 08:03:33.819625 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_be367e3118afbcc92c106aa60ad89a847599882d02c81ac0e6d7ee6cbcpg7z7_a1525cb6-0a04-499c-8737-81f1981815da/pull/0.log" Mar 20 08:03:33 crc kubenswrapper[4749]: I0320 08:03:33.851343 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_be367e3118afbcc92c106aa60ad89a847599882d02c81ac0e6d7ee6cbcpg7z7_a1525cb6-0a04-499c-8737-81f1981815da/extract/0.log" Mar 20 08:03:34 crc kubenswrapper[4749]: I0320 08:03:34.232372 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-mfsfk_26434b2d-c04d-42b7-9631-6d0851886141/manager/0.log" Mar 20 08:03:34 crc kubenswrapper[4749]: I0320 08:03:34.455632 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-t97b7_06c975b5-ec27-4ff9-b7bb-115c12275ac2/manager/0.log" Mar 20 08:03:34 crc kubenswrapper[4749]: I0320 08:03:34.462251 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-9t9xf_88958bd4-4087-4f7c-b72e-9c2cea412993/manager/0.log" Mar 20 08:03:34 crc kubenswrapper[4749]: I0320 08:03:34.640628 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-gdxrh_b97ffca4-e4a1-4fbf-8271-d97410ffa49a/manager/0.log" Mar 20 08:03:34 crc kubenswrapper[4749]: I0320 08:03:34.684019 4749 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-7bpmp_a56dfc81-4a0f-4e99-a884-cff054d164b9/manager/0.log" Mar 20 08:03:34 crc kubenswrapper[4749]: I0320 08:03:34.915691 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-5jlcc_cee7836b-e12f-4de9-be6b-4caa60294269/manager/0.log" Mar 20 08:03:34 crc kubenswrapper[4749]: I0320 08:03:34.954021 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5d9899ccc6-2x44r_5af1049c-beed-4d2a-93da-95171c0142e3/manager/0.log" Mar 20 08:03:35 crc kubenswrapper[4749]: I0320 08:03:35.106708 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-zpwsq_fa996ed9-64cd-4371-80e7-8122c77285fc/manager/0.log" Mar 20 08:03:35 crc kubenswrapper[4749]: I0320 08:03:35.142849 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-fpgkx_a0cd89a4-110c-4df5-b9ce-186f38d9be30/manager/0.log" Mar 20 08:03:35 crc kubenswrapper[4749]: I0320 08:03:35.315510 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-mscpf_6b2dc985-5b75-4bc6-8c79-392034f38960/manager/0.log" Mar 20 08:03:35 crc kubenswrapper[4749]: I0320 08:03:35.355951 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-548s6_cd82533b-5f9e-45e3-a645-90e678bcbf4a/manager/0.log" Mar 20 08:03:35 crc kubenswrapper[4749]: I0320 08:03:35.487683 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-6kn5g_a0e5f3af-b138-43f6-b007-ca56ec51851c/manager/0.log" Mar 20 08:03:35 crc kubenswrapper[4749]: I0320 08:03:35.562452 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-dhsjc_f5230399-dbb2-4a03-afcb-58dd2c1fdd22/manager/0.log" Mar 20 08:03:35 crc kubenswrapper[4749]: I0320 08:03:35.630879 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-89d64c458-4qwrj_8ba67eb0-3c0d-4558-b603-3626f3980dad/manager/0.log" Mar 20 08:03:35 crc kubenswrapper[4749]: I0320 08:03:35.855039 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-7f68b6bcd8-5cmds_1cf0a010-6087-4784-8303-8be78ad550e1/operator/0.log" Mar 20 08:03:36 crc kubenswrapper[4749]: I0320 08:03:36.055756 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-zm7gn_14b1dfbd-5576-43ed-b482-da48c031840a/registry-server/0.log" Mar 20 08:03:36 crc kubenswrapper[4749]: I0320 08:03:36.077432 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-65c7c8696f-s7w78_461831bb-9c93-49f8-a32e-ec01c4bdc549/manager/0.log" Mar 20 08:03:36 crc kubenswrapper[4749]: I0320 08:03:36.256217 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-nqz8s_9d6d1c42-480e-49ac-8a40-233fb95e4a0a/manager/0.log" Mar 20 08:03:36 crc kubenswrapper[4749]: I0320 08:03:36.257515 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-kmzst_b4c290c3-309d-4706-935a-0e33bf4e403b/manager/0.log" Mar 20 08:03:36 crc kubenswrapper[4749]: I0320 08:03:36.415898 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-86v48_589f626e-af46-4f5e-98f6-d4ad787f84d8/operator/0.log" Mar 20 08:03:36 crc kubenswrapper[4749]: I0320 08:03:36.482190 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-289b6_a15a8919-b4d7-418a-b725-38e7d7b0e859/manager/0.log" Mar 20 08:03:36 crc kubenswrapper[4749]: I0320 08:03:36.660606 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-p867r_75ab6716-99ca-4fd9-a632-0bc69d5c3742/manager/0.log" Mar 20 08:03:36 crc kubenswrapper[4749]: I0320 08:03:36.674891 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-dq6v4_7161b86e-8178-40db-a6a3-71f724746aed/manager/0.log" Mar 20 08:03:36 crc kubenswrapper[4749]: I0320 08:03:36.851413 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-6pbbk_ddbd9da2-a48e-4e49-894e-4a9ae1109a73/manager/0.log" Mar 20 08:03:37 crc kubenswrapper[4749]: I0320 08:03:37.177252 4749 scope.go:117] "RemoveContainer" containerID="ea0896570df2e5e44ead081d77eb8ae850d278e89ed0c72cc980fc1dfdd27382" Mar 20 08:03:37 crc kubenswrapper[4749]: E0320 08:03:37.177529 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 08:03:42 crc kubenswrapper[4749]: I0320 08:03:42.177414 4749 scope.go:117] "RemoveContainer" containerID="cc4625b9209797aa379352fd9f58107f7e42d2b4f479b4f5c1a0f1d6334d65a3" Mar 20 08:03:42 crc kubenswrapper[4749]: E0320 08:03:42.178033 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 08:03:47 crc kubenswrapper[4749]: I0320 08:03:47.178528 4749 scope.go:117] "RemoveContainer" containerID="ae3deabce868852ca8b2c75b310320888cefe3d1f8956d9347d0fcd6a7d22838" Mar 20 08:03:47 crc kubenswrapper[4749]: E0320 08:03:47.179159 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 08:03:50 crc kubenswrapper[4749]: I0320 08:03:50.185223 4749 scope.go:117] "RemoveContainer" containerID="ea0896570df2e5e44ead081d77eb8ae850d278e89ed0c72cc980fc1dfdd27382" Mar 20 08:03:50 crc kubenswrapper[4749]: E0320 08:03:50.186110 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 08:03:54 crc kubenswrapper[4749]: I0320 08:03:54.183452 4749 scope.go:117] "RemoveContainer" containerID="cc4625b9209797aa379352fd9f58107f7e42d2b4f479b4f5c1a0f1d6334d65a3" Mar 20 08:03:54 crc kubenswrapper[4749]: E0320 08:03:54.183890 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 08:03:56 crc kubenswrapper[4749]: I0320 08:03:56.701450 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-6sv2z_da30337a-26c0-4b0b-beb5-c46c48facfc6/control-plane-machine-set-operator/0.log" Mar 20 08:03:56 crc kubenswrapper[4749]: I0320 08:03:56.870016 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-phw2k_ecc1f279-eced-4b51-8ded-b7d00d089722/machine-api-operator/0.log" Mar 20 08:03:56 crc kubenswrapper[4749]: I0320 08:03:56.929813 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-phw2k_ecc1f279-eced-4b51-8ded-b7d00d089722/kube-rbac-proxy/0.log" Mar 20 08:04:00 crc kubenswrapper[4749]: I0320 08:04:00.150074 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566564-29tb5"] Mar 20 08:04:00 crc kubenswrapper[4749]: E0320 08:04:00.150942 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fd7b424-f9e0-4da9-bcbb-5b522a6e5c6d" containerName="container-00" Mar 20 08:04:00 crc kubenswrapper[4749]: I0320 08:04:00.150958 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fd7b424-f9e0-4da9-bcbb-5b522a6e5c6d" containerName="container-00" Mar 20 08:04:00 crc kubenswrapper[4749]: I0320 08:04:00.151217 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fd7b424-f9e0-4da9-bcbb-5b522a6e5c6d" containerName="container-00" Mar 20 08:04:00 crc kubenswrapper[4749]: I0320 08:04:00.151888 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566564-29tb5" Mar 20 08:04:00 crc kubenswrapper[4749]: I0320 08:04:00.155866 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 08:04:00 crc kubenswrapper[4749]: I0320 08:04:00.155866 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 08:04:00 crc kubenswrapper[4749]: I0320 08:04:00.155971 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6rhdf" Mar 20 08:04:00 crc kubenswrapper[4749]: I0320 08:04:00.158040 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566564-29tb5"] Mar 20 08:04:00 crc kubenswrapper[4749]: I0320 08:04:00.177043 4749 scope.go:117] "RemoveContainer" containerID="ae3deabce868852ca8b2c75b310320888cefe3d1f8956d9347d0fcd6a7d22838" Mar 20 08:04:00 crc kubenswrapper[4749]: E0320 08:04:00.177344 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 08:04:00 crc kubenswrapper[4749]: I0320 08:04:00.218484 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvg8b\" (UniqueName: \"kubernetes.io/projected/94272621-96be-4886-b86e-0d70244122c6-kube-api-access-lvg8b\") pod \"auto-csr-approver-29566564-29tb5\" (UID: \"94272621-96be-4886-b86e-0d70244122c6\") " pod="openshift-infra/auto-csr-approver-29566564-29tb5" Mar 20 08:04:00 crc kubenswrapper[4749]: I0320 08:04:00.320860 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvg8b\" (UniqueName: \"kubernetes.io/projected/94272621-96be-4886-b86e-0d70244122c6-kube-api-access-lvg8b\") pod \"auto-csr-approver-29566564-29tb5\" (UID: \"94272621-96be-4886-b86e-0d70244122c6\") " pod="openshift-infra/auto-csr-approver-29566564-29tb5" Mar 20 08:04:00 crc kubenswrapper[4749]: I0320 08:04:00.345649 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvg8b\" (UniqueName: \"kubernetes.io/projected/94272621-96be-4886-b86e-0d70244122c6-kube-api-access-lvg8b\") pod \"auto-csr-approver-29566564-29tb5\" (UID: \"94272621-96be-4886-b86e-0d70244122c6\") " pod="openshift-infra/auto-csr-approver-29566564-29tb5" Mar 20 08:04:00 crc kubenswrapper[4749]: I0320 08:04:00.485213 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566564-29tb5" Mar 20 08:04:00 crc kubenswrapper[4749]: I0320 08:04:00.968856 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566564-29tb5"] Mar 20 08:04:00 crc kubenswrapper[4749]: W0320 08:04:00.973695 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94272621_96be_4886_b86e_0d70244122c6.slice/crio-18c98fccf1b275bd24174360e6f975dcfb3327f95cc5a700d22378d83d57c4f0 WatchSource:0}: Error finding container 18c98fccf1b275bd24174360e6f975dcfb3327f95cc5a700d22378d83d57c4f0: Status 404 returned error can't find the container with id 18c98fccf1b275bd24174360e6f975dcfb3327f95cc5a700d22378d83d57c4f0 Mar 20 08:04:01 crc kubenswrapper[4749]: I0320 08:04:01.003799 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566564-29tb5" event={"ID":"94272621-96be-4886-b86e-0d70244122c6","Type":"ContainerStarted","Data":"18c98fccf1b275bd24174360e6f975dcfb3327f95cc5a700d22378d83d57c4f0"} Mar 20 08:04:03 crc kubenswrapper[4749]: I0320 08:04:03.021624 4749 generic.go:334] "Generic (PLEG): container finished" podID="94272621-96be-4886-b86e-0d70244122c6" containerID="85f139e53131454461490bda16d0622dd54e31f7eb3a9c220d410c41eb226544" exitCode=0 Mar 20 08:04:03 crc kubenswrapper[4749]: I0320 08:04:03.021717 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566564-29tb5" event={"ID":"94272621-96be-4886-b86e-0d70244122c6","Type":"ContainerDied","Data":"85f139e53131454461490bda16d0622dd54e31f7eb3a9c220d410c41eb226544"} Mar 20 08:04:03 crc kubenswrapper[4749]: I0320 08:04:03.177479 4749 scope.go:117] "RemoveContainer" containerID="ea0896570df2e5e44ead081d77eb8ae850d278e89ed0c72cc980fc1dfdd27382" Mar 20 08:04:03 crc kubenswrapper[4749]: E0320 08:04:03.177814 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 08:04:04 crc kubenswrapper[4749]: I0320 08:04:04.326095 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566564-29tb5" Mar 20 08:04:04 crc kubenswrapper[4749]: I0320 08:04:04.398892 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvg8b\" (UniqueName: \"kubernetes.io/projected/94272621-96be-4886-b86e-0d70244122c6-kube-api-access-lvg8b\") pod \"94272621-96be-4886-b86e-0d70244122c6\" (UID: \"94272621-96be-4886-b86e-0d70244122c6\") " Mar 20 08:04:04 crc kubenswrapper[4749]: I0320 08:04:04.405061 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94272621-96be-4886-b86e-0d70244122c6-kube-api-access-lvg8b" (OuterVolumeSpecName: "kube-api-access-lvg8b") pod "94272621-96be-4886-b86e-0d70244122c6" (UID: "94272621-96be-4886-b86e-0d70244122c6"). InnerVolumeSpecName "kube-api-access-lvg8b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:04:04 crc kubenswrapper[4749]: I0320 08:04:04.501235 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvg8b\" (UniqueName: \"kubernetes.io/projected/94272621-96be-4886-b86e-0d70244122c6-kube-api-access-lvg8b\") on node \"crc\" DevicePath \"\"" Mar 20 08:04:05 crc kubenswrapper[4749]: I0320 08:04:05.037945 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566564-29tb5" event={"ID":"94272621-96be-4886-b86e-0d70244122c6","Type":"ContainerDied","Data":"18c98fccf1b275bd24174360e6f975dcfb3327f95cc5a700d22378d83d57c4f0"} Mar 20 08:04:05 crc kubenswrapper[4749]: I0320 08:04:05.037999 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18c98fccf1b275bd24174360e6f975dcfb3327f95cc5a700d22378d83d57c4f0" Mar 20 08:04:05 crc kubenswrapper[4749]: I0320 08:04:05.038017 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566564-29tb5" Mar 20 08:04:05 crc kubenswrapper[4749]: I0320 08:04:05.412174 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566558-j4rfz"] Mar 20 08:04:05 crc kubenswrapper[4749]: I0320 08:04:05.417741 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566558-j4rfz"] Mar 20 08:04:06 crc kubenswrapper[4749]: I0320 08:04:06.188556 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d894f2e-7d4b-4e30-bfb2-9e50ed440a9e" path="/var/lib/kubelet/pods/9d894f2e-7d4b-4e30-bfb2-9e50ed440a9e/volumes" Mar 20 08:04:08 crc kubenswrapper[4749]: I0320 08:04:08.177820 4749 scope.go:117] "RemoveContainer" containerID="cc4625b9209797aa379352fd9f58107f7e42d2b4f479b4f5c1a0f1d6334d65a3" Mar 20 08:04:09 crc kubenswrapper[4749]: I0320 08:04:09.077663 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8b9b402f-2d95-48f5-98d8-497d90956ba2","Type":"ContainerStarted","Data":"9046ebdbe872cb7e98b516adea42a9b9e20a0a9303575f764937ff5f20a07ca8"} Mar 20 08:04:09 crc kubenswrapper[4749]: I0320 08:04:09.078521 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:04:11 crc kubenswrapper[4749]: I0320 08:04:11.044141 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-p7gg4_41f8fcf3-85bc-4ff0-926c-857f426fa501/cert-manager-controller/0.log" Mar 20 08:04:11 crc kubenswrapper[4749]: I0320 08:04:11.177051 4749 scope.go:117] "RemoveContainer" containerID="ae3deabce868852ca8b2c75b310320888cefe3d1f8956d9347d0fcd6a7d22838" Mar 20 08:04:11 crc kubenswrapper[4749]: E0320 08:04:11.177271 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 08:04:11 crc kubenswrapper[4749]: I0320 08:04:11.194720 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-9pjlm_d10c7079-87d2-41c4-acda-82bc9d8365d2/cert-manager-cainjector/0.log" Mar 20 08:04:11 crc kubenswrapper[4749]: I0320 08:04:11.300785 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-grns2_8be44205-7bc6-4802-addc-996357e9ffd0/cert-manager-webhook/0.log" Mar 20 08:04:12 crc kubenswrapper[4749]: E0320 08:04:12.453838 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b9b402f_2d95_48f5_98d8_497d90956ba2.slice/crio-9046ebdbe872cb7e98b516adea42a9b9e20a0a9303575f764937ff5f20a07ca8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b9b402f_2d95_48f5_98d8_497d90956ba2.slice/crio-conmon-9046ebdbe872cb7e98b516adea42a9b9e20a0a9303575f764937ff5f20a07ca8.scope\": RecentStats: unable to find data in memory cache]" Mar 20 08:04:13 crc kubenswrapper[4749]: I0320 08:04:13.105430 4749 generic.go:334] "Generic (PLEG): container finished" podID="8b9b402f-2d95-48f5-98d8-497d90956ba2" containerID="9046ebdbe872cb7e98b516adea42a9b9e20a0a9303575f764937ff5f20a07ca8" exitCode=0 Mar 20 08:04:13 crc kubenswrapper[4749]: I0320 08:04:13.105479 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8b9b402f-2d95-48f5-98d8-497d90956ba2","Type":"ContainerDied","Data":"9046ebdbe872cb7e98b516adea42a9b9e20a0a9303575f764937ff5f20a07ca8"} Mar 20 08:04:13 crc kubenswrapper[4749]: I0320 08:04:13.105515 4749 scope.go:117] "RemoveContainer" containerID="cc4625b9209797aa379352fd9f58107f7e42d2b4f479b4f5c1a0f1d6334d65a3" Mar 20 08:04:13 crc kubenswrapper[4749]: I0320 08:04:13.106145 4749 scope.go:117] "RemoveContainer" containerID="9046ebdbe872cb7e98b516adea42a9b9e20a0a9303575f764937ff5f20a07ca8" Mar 20 08:04:13 crc kubenswrapper[4749]: E0320 08:04:13.106551 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 08:04:17 crc kubenswrapper[4749]: I0320 08:04:17.176915 4749 scope.go:117] "RemoveContainer" containerID="ea0896570df2e5e44ead081d77eb8ae850d278e89ed0c72cc980fc1dfdd27382" Mar 20 08:04:17 crc kubenswrapper[4749]: E0320 08:04:17.177539 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 08:04:23 crc kubenswrapper[4749]: I0320 08:04:23.177563 4749 scope.go:117] "RemoveContainer" containerID="ae3deabce868852ca8b2c75b310320888cefe3d1f8956d9347d0fcd6a7d22838" Mar 20 08:04:24 crc kubenswrapper[4749]: I0320 08:04:24.229579 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8db06e36-0b00-4157-9345-69449da3e85f","Type":"ContainerStarted","Data":"621a816df4454c8ecf233dee141dafaa70aaf28eb681fbb44d2aa60ac4efe015"} Mar 20 08:04:24 crc kubenswrapper[4749]: I0320 08:04:24.230138 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 20 08:04:24 crc kubenswrapper[4749]: I0320 08:04:24.474306 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-nzfz2_e8b78869-90fe-4c18-9cc0-6605ad9ecdbc/nmstate-console-plugin/0.log" Mar 20 08:04:24 crc kubenswrapper[4749]: I0320 08:04:24.714075 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-cmf78_74b419be-dfe1-4b8c-a6b1-79b50e32b335/nmstate-handler/0.log" Mar 20 08:04:24 crc kubenswrapper[4749]: I0320 08:04:24.799125 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-ggtqr_80061915-6756-40d7-9f66-71248a0255dd/kube-rbac-proxy/0.log" Mar 20 08:04:24 crc kubenswrapper[4749]: I0320 08:04:24.853359 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-ggtqr_80061915-6756-40d7-9f66-71248a0255dd/nmstate-metrics/0.log" Mar 20 08:04:24 crc kubenswrapper[4749]: I0320 08:04:24.871216 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-hdj6p_31193cc4-cc66-42d3-9029-2fcd90d1d9bc/nmstate-operator/0.log" Mar 20 08:04:25 crc kubenswrapper[4749]: I0320 08:04:25.015996 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-qzwd2_45ebdb11-6585-4874-986c-7a5f0e456e26/nmstate-webhook/0.log" Mar 20 08:04:26 crc kubenswrapper[4749]: I0320 08:04:26.177336 4749 scope.go:117] "RemoveContainer" containerID="9046ebdbe872cb7e98b516adea42a9b9e20a0a9303575f764937ff5f20a07ca8" Mar 20 08:04:26 crc kubenswrapper[4749]: E0320 08:04:26.179023 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 08:04:27 crc kubenswrapper[4749]: I0320 08:04:27.259125 4749 generic.go:334] "Generic (PLEG): container finished" podID="8db06e36-0b00-4157-9345-69449da3e85f" containerID="621a816df4454c8ecf233dee141dafaa70aaf28eb681fbb44d2aa60ac4efe015" exitCode=0 Mar 20 08:04:27 crc kubenswrapper[4749]: I0320 08:04:27.259174 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8db06e36-0b00-4157-9345-69449da3e85f","Type":"ContainerDied","Data":"621a816df4454c8ecf233dee141dafaa70aaf28eb681fbb44d2aa60ac4efe015"} Mar 20 08:04:27 crc kubenswrapper[4749]: I0320 08:04:27.259212 4749 scope.go:117] "RemoveContainer" containerID="ae3deabce868852ca8b2c75b310320888cefe3d1f8956d9347d0fcd6a7d22838" Mar 20 08:04:27 crc kubenswrapper[4749]: I0320 08:04:27.260258 4749 scope.go:117] "RemoveContainer" containerID="621a816df4454c8ecf233dee141dafaa70aaf28eb681fbb44d2aa60ac4efe015" Mar 20 08:04:27 crc kubenswrapper[4749]: E0320 08:04:27.260927 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 08:04:28 crc kubenswrapper[4749]: I0320 08:04:28.178404 4749 scope.go:117] "RemoveContainer" containerID="ea0896570df2e5e44ead081d77eb8ae850d278e89ed0c72cc980fc1dfdd27382" Mar 20 08:04:28 crc kubenswrapper[4749]: E0320 08:04:28.179570 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 08:04:38 crc kubenswrapper[4749]: I0320 08:04:38.177495 4749 scope.go:117] "RemoveContainer" containerID="9046ebdbe872cb7e98b516adea42a9b9e20a0a9303575f764937ff5f20a07ca8" Mar 20 08:04:38 crc kubenswrapper[4749]: E0320 08:04:38.178376 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 08:04:40 crc kubenswrapper[4749]: I0320 08:04:40.177522 4749 scope.go:117] "RemoveContainer" containerID="621a816df4454c8ecf233dee141dafaa70aaf28eb681fbb44d2aa60ac4efe015" Mar 20 08:04:40 crc kubenswrapper[4749]: E0320 08:04:40.178197 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 08:04:41 crc kubenswrapper[4749]: I0320 08:04:41.177351 4749 scope.go:117] "RemoveContainer" containerID="ea0896570df2e5e44ead081d77eb8ae850d278e89ed0c72cc980fc1dfdd27382" Mar 20 08:04:41 crc kubenswrapper[4749]: E0320 08:04:41.178032 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 08:04:51 crc kubenswrapper[4749]: I0320 08:04:51.177871 4749 scope.go:117] "RemoveContainer" containerID="9046ebdbe872cb7e98b516adea42a9b9e20a0a9303575f764937ff5f20a07ca8" Mar 20 08:04:51 crc kubenswrapper[4749]: E0320 08:04:51.179000 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 08:04:53 crc kubenswrapper[4749]: I0320 08:04:53.285330 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-d7bq6_b50e6e0c-7ab7-4516-929a-49e48e00c1e2/kube-rbac-proxy/0.log" Mar 20 08:04:53 crc kubenswrapper[4749]: I0320 08:04:53.390124 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-d7bq6_b50e6e0c-7ab7-4516-929a-49e48e00c1e2/controller/0.log" Mar 20 08:04:53 crc kubenswrapper[4749]: I0320 08:04:53.493496 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ww9w_82ef576a-cd53-4b86-8ddc-e2528fa1b23d/cp-frr-files/0.log" Mar 20 08:04:53 crc kubenswrapper[4749]: I0320 08:04:53.631351 4749 log.go:25] "Finished parsing 
log file" path="/var/log/pods/metallb-system_frr-k8s-5ww9w_82ef576a-cd53-4b86-8ddc-e2528fa1b23d/cp-frr-files/0.log" Mar 20 08:04:53 crc kubenswrapper[4749]: I0320 08:04:53.660449 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ww9w_82ef576a-cd53-4b86-8ddc-e2528fa1b23d/cp-reloader/0.log" Mar 20 08:04:53 crc kubenswrapper[4749]: I0320 08:04:53.666481 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ww9w_82ef576a-cd53-4b86-8ddc-e2528fa1b23d/cp-reloader/0.log" Mar 20 08:04:53 crc kubenswrapper[4749]: I0320 08:04:53.701923 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ww9w_82ef576a-cd53-4b86-8ddc-e2528fa1b23d/cp-metrics/0.log" Mar 20 08:04:53 crc kubenswrapper[4749]: I0320 08:04:53.868191 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ww9w_82ef576a-cd53-4b86-8ddc-e2528fa1b23d/cp-metrics/0.log" Mar 20 08:04:53 crc kubenswrapper[4749]: I0320 08:04:53.873341 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ww9w_82ef576a-cd53-4b86-8ddc-e2528fa1b23d/cp-reloader/0.log" Mar 20 08:04:53 crc kubenswrapper[4749]: I0320 08:04:53.880820 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ww9w_82ef576a-cd53-4b86-8ddc-e2528fa1b23d/cp-frr-files/0.log" Mar 20 08:04:53 crc kubenswrapper[4749]: I0320 08:04:53.929644 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ww9w_82ef576a-cd53-4b86-8ddc-e2528fa1b23d/cp-metrics/0.log" Mar 20 08:04:54 crc kubenswrapper[4749]: I0320 08:04:54.045356 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ww9w_82ef576a-cd53-4b86-8ddc-e2528fa1b23d/cp-reloader/0.log" Mar 20 08:04:54 crc kubenswrapper[4749]: I0320 08:04:54.082297 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ww9w_82ef576a-cd53-4b86-8ddc-e2528fa1b23d/cp-metrics/0.log" Mar 20 08:04:54 crc kubenswrapper[4749]: I0320 08:04:54.112574 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ww9w_82ef576a-cd53-4b86-8ddc-e2528fa1b23d/cp-frr-files/0.log" Mar 20 08:04:54 crc kubenswrapper[4749]: I0320 08:04:54.116214 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ww9w_82ef576a-cd53-4b86-8ddc-e2528fa1b23d/controller/0.log" Mar 20 08:04:54 crc kubenswrapper[4749]: I0320 08:04:54.181506 4749 scope.go:117] "RemoveContainer" containerID="621a816df4454c8ecf233dee141dafaa70aaf28eb681fbb44d2aa60ac4efe015" Mar 20 08:04:54 crc kubenswrapper[4749]: E0320 08:04:54.181916 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 08:04:54 crc kubenswrapper[4749]: I0320 08:04:54.183366 4749 scope.go:117] "RemoveContainer" containerID="ea0896570df2e5e44ead081d77eb8ae850d278e89ed0c72cc980fc1dfdd27382" Mar 20 08:04:54 crc kubenswrapper[4749]: E0320 08:04:54.183721 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 08:04:54 crc kubenswrapper[4749]: I0320 08:04:54.281866 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ww9w_82ef576a-cd53-4b86-8ddc-e2528fa1b23d/frr-metrics/0.log" Mar 20 08:04:54 crc kubenswrapper[4749]: I0320 08:04:54.301299 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ww9w_82ef576a-cd53-4b86-8ddc-e2528fa1b23d/kube-rbac-proxy/0.log" Mar 20 08:04:54 crc kubenswrapper[4749]: I0320 08:04:54.358272 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ww9w_82ef576a-cd53-4b86-8ddc-e2528fa1b23d/kube-rbac-proxy-frr/0.log" Mar 20 08:04:54 crc kubenswrapper[4749]: I0320 08:04:54.500791 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ww9w_82ef576a-cd53-4b86-8ddc-e2528fa1b23d/reloader/0.log" Mar 20 08:04:54 crc kubenswrapper[4749]: I0320 08:04:54.727593 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ww9w_82ef576a-cd53-4b86-8ddc-e2528fa1b23d/frr/0.log" Mar 20 08:04:54 crc kubenswrapper[4749]: I0320 08:04:54.750621 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-5p9db_b235c243-14bb-4dba-9905-5bd230ae2879/frr-k8s-webhook-server/0.log" Mar 20 08:04:54 crc kubenswrapper[4749]: I0320 08:04:54.866953 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-789686cc7-8s64f_2cee17f1-fcc8-4ae8-aafe-d7eebabbe966/manager/0.log" Mar 20 08:04:54 crc kubenswrapper[4749]: I0320 08:04:54.958749 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-784cc76666-86nb4_5201be3d-1ed8-4536-a235-1adc0203c10e/webhook-server/0.log" Mar 20 08:04:55 crc kubenswrapper[4749]: I0320 08:04:55.067178 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-s5f42_ae069ce2-c4d3-434f-8c66-95a75561bf8b/kube-rbac-proxy/0.log" Mar 20 08:04:55 crc kubenswrapper[4749]: I0320 08:04:55.273326 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-s5f42_ae069ce2-c4d3-434f-8c66-95a75561bf8b/speaker/0.log" Mar 20 08:04:56 crc kubenswrapper[4749]: I0320 08:04:56.964847 4749 scope.go:117] "RemoveContainer" containerID="99df9f31376559c47d49dd830959e482f2c285c17b804335d85dd870efc37e5c" Mar 20 08:05:05 crc kubenswrapper[4749]: I0320 08:05:05.177628 4749 scope.go:117] "RemoveContainer" containerID="621a816df4454c8ecf233dee141dafaa70aaf28eb681fbb44d2aa60ac4efe015" Mar 20 08:05:05 crc kubenswrapper[4749]: E0320 08:05:05.179980 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 08:05:06 crc kubenswrapper[4749]: I0320 08:05:06.178151 4749 scope.go:117] "RemoveContainer" containerID="9046ebdbe872cb7e98b516adea42a9b9e20a0a9303575f764937ff5f20a07ca8" Mar 20 08:05:06 crc kubenswrapper[4749]: E0320 08:05:06.178501 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 08:05:07 crc kubenswrapper[4749]: I0320 08:05:07.177809 4749 scope.go:117] "RemoveContainer" containerID="ea0896570df2e5e44ead081d77eb8ae850d278e89ed0c72cc980fc1dfdd27382" Mar 20 08:05:07 crc kubenswrapper[4749]: E0320 08:05:07.178054 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 08:05:09 crc kubenswrapper[4749]: I0320 08:05:09.224154 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n7mld_68f28f80-0a90-4e42-ac6e-66fff5eec59a/util/0.log" Mar 20 08:05:09 crc kubenswrapper[4749]: I0320 08:05:09.591911 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n7mld_68f28f80-0a90-4e42-ac6e-66fff5eec59a/pull/0.log" Mar 20 08:05:09 crc kubenswrapper[4749]: I0320 08:05:09.622266 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n7mld_68f28f80-0a90-4e42-ac6e-66fff5eec59a/pull/0.log" Mar 20 08:05:09 crc kubenswrapper[4749]: I0320 08:05:09.659481 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n7mld_68f28f80-0a90-4e42-ac6e-66fff5eec59a/util/0.log" Mar 20 08:05:09 crc kubenswrapper[4749]: I0320 08:05:09.782982 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n7mld_68f28f80-0a90-4e42-ac6e-66fff5eec59a/pull/0.log" Mar 20 08:05:09 crc kubenswrapper[4749]: I0320 08:05:09.817972 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n7mld_68f28f80-0a90-4e42-ac6e-66fff5eec59a/util/0.log" Mar 20 08:05:09 crc kubenswrapper[4749]: I0320 08:05:09.846841 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n7mld_68f28f80-0a90-4e42-ac6e-66fff5eec59a/extract/0.log" Mar 20 08:05:09 crc kubenswrapper[4749]: I0320 08:05:09.966792 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16xgc9_0d7f85fc-7895-4f42-8cc8-587c5f7f0f21/util/0.log" Mar 20 08:05:10 crc kubenswrapper[4749]: I0320 08:05:10.135247 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16xgc9_0d7f85fc-7895-4f42-8cc8-587c5f7f0f21/util/0.log" Mar 20 08:05:10 crc kubenswrapper[4749]: I0320 08:05:10.145144 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16xgc9_0d7f85fc-7895-4f42-8cc8-587c5f7f0f21/pull/0.log" Mar 
20 08:05:10 crc kubenswrapper[4749]: I0320 08:05:10.147760 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16xgc9_0d7f85fc-7895-4f42-8cc8-587c5f7f0f21/pull/0.log" Mar 20 08:05:10 crc kubenswrapper[4749]: I0320 08:05:10.345869 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16xgc9_0d7f85fc-7895-4f42-8cc8-587c5f7f0f21/util/0.log" Mar 20 08:05:10 crc kubenswrapper[4749]: I0320 08:05:10.403357 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16xgc9_0d7f85fc-7895-4f42-8cc8-587c5f7f0f21/pull/0.log" Mar 20 08:05:10 crc kubenswrapper[4749]: I0320 08:05:10.427120 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c16xgc9_0d7f85fc-7895-4f42-8cc8-587c5f7f0f21/extract/0.log" Mar 20 08:05:10 crc kubenswrapper[4749]: I0320 08:05:10.507393 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dn6gk_e86bd4a9-6a7b-432d-824c-e03199a458f6/extract-utilities/0.log" Mar 20 08:05:10 crc kubenswrapper[4749]: I0320 08:05:10.698266 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dn6gk_e86bd4a9-6a7b-432d-824c-e03199a458f6/extract-content/0.log" Mar 20 08:05:10 crc kubenswrapper[4749]: I0320 08:05:10.703493 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dn6gk_e86bd4a9-6a7b-432d-824c-e03199a458f6/extract-utilities/0.log" Mar 20 08:05:10 crc kubenswrapper[4749]: I0320 08:05:10.729781 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dn6gk_e86bd4a9-6a7b-432d-824c-e03199a458f6/extract-content/0.log" Mar 20 08:05:10 crc kubenswrapper[4749]: I0320 08:05:10.886008 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dn6gk_e86bd4a9-6a7b-432d-824c-e03199a458f6/extract-content/0.log" Mar 20 08:05:10 crc kubenswrapper[4749]: I0320 08:05:10.920051 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dn6gk_e86bd4a9-6a7b-432d-824c-e03199a458f6/extract-utilities/0.log" Mar 20 08:05:11 crc kubenswrapper[4749]: I0320 08:05:11.067116 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p9d7m_69af4d7d-d164-4541-b2cf-edc3ce20af02/extract-utilities/0.log" Mar 20 08:05:11 crc kubenswrapper[4749]: I0320 08:05:11.304780 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p9d7m_69af4d7d-d164-4541-b2cf-edc3ce20af02/extract-utilities/0.log" Mar 20 08:05:11 crc kubenswrapper[4749]: I0320 08:05:11.336927 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p9d7m_69af4d7d-d164-4541-b2cf-edc3ce20af02/extract-content/0.log" Mar 20 08:05:11 crc kubenswrapper[4749]: I0320 08:05:11.405909 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dn6gk_e86bd4a9-6a7b-432d-824c-e03199a458f6/registry-server/0.log" Mar 20 08:05:11 crc kubenswrapper[4749]: I0320 08:05:11.408524 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-p9d7m_69af4d7d-d164-4541-b2cf-edc3ce20af02/extract-content/0.log" Mar 20 08:05:11 crc kubenswrapper[4749]: I0320 08:05:11.541721 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p9d7m_69af4d7d-d164-4541-b2cf-edc3ce20af02/extract-content/0.log" Mar 20 08:05:11 crc kubenswrapper[4749]: I0320 08:05:11.561842 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p9d7m_69af4d7d-d164-4541-b2cf-edc3ce20af02/extract-utilities/0.log" Mar 20 08:05:11 crc kubenswrapper[4749]: I0320 08:05:11.806139 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p9d7m_69af4d7d-d164-4541-b2cf-edc3ce20af02/registry-server/0.log" Mar 20 08:05:11 crc kubenswrapper[4749]: I0320 08:05:11.810078 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-9lpwm_ab07be1c-a7c8-4310-b2be-7dea01a4a55b/marketplace-operator/0.log" Mar 20 08:05:11 crc kubenswrapper[4749]: I0320 08:05:11.855375 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-t2l62_abf5fd3e-fb60-488e-9907-02dc8aa57901/extract-utilities/0.log" Mar 20 08:05:12 crc kubenswrapper[4749]: I0320 08:05:12.001497 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-t2l62_abf5fd3e-fb60-488e-9907-02dc8aa57901/extract-content/0.log" Mar 20 08:05:12 crc kubenswrapper[4749]: I0320 08:05:12.004863 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-t2l62_abf5fd3e-fb60-488e-9907-02dc8aa57901/extract-utilities/0.log" Mar 20 08:05:12 crc kubenswrapper[4749]: I0320 08:05:12.032777 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-t2l62_abf5fd3e-fb60-488e-9907-02dc8aa57901/extract-content/0.log" Mar 20 08:05:12 crc kubenswrapper[4749]: I0320 08:05:12.241642 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-t2l62_abf5fd3e-fb60-488e-9907-02dc8aa57901/extract-content/0.log" Mar 20 08:05:12 crc kubenswrapper[4749]: I0320 08:05:12.252241 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-t2l62_abf5fd3e-fb60-488e-9907-02dc8aa57901/extract-utilities/0.log" Mar 20 08:05:12 crc kubenswrapper[4749]: I0320 08:05:12.332994 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-t2l62_abf5fd3e-fb60-488e-9907-02dc8aa57901/registry-server/0.log" Mar 20 08:05:12 crc kubenswrapper[4749]: I0320 08:05:12.434264 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mt952_d3b30307-7d91-49c8-a5d4-79c1501c442f/extract-utilities/0.log" Mar 20 08:05:12 crc kubenswrapper[4749]: I0320 08:05:12.595617 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mt952_d3b30307-7d91-49c8-a5d4-79c1501c442f/extract-utilities/0.log" Mar 20 08:05:12 crc kubenswrapper[4749]: I0320 08:05:12.609329 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mt952_d3b30307-7d91-49c8-a5d4-79c1501c442f/extract-content/0.log" Mar 20 08:05:12 crc kubenswrapper[4749]: I0320 08:05:12.615000 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-mt952_d3b30307-7d91-49c8-a5d4-79c1501c442f/extract-content/0.log" Mar 20 08:05:12 crc kubenswrapper[4749]: I0320 08:05:12.829875 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mt952_d3b30307-7d91-49c8-a5d4-79c1501c442f/extract-content/0.log" Mar 20 08:05:12 crc kubenswrapper[4749]: I0320 08:05:12.833131 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mt952_d3b30307-7d91-49c8-a5d4-79c1501c442f/extract-utilities/0.log" Mar 20 08:05:13 crc kubenswrapper[4749]: I0320 08:05:13.244827 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mt952_d3b30307-7d91-49c8-a5d4-79c1501c442f/registry-server/0.log" Mar 20 08:05:17 crc kubenswrapper[4749]: I0320 08:05:17.177695 4749 scope.go:117] "RemoveContainer" containerID="621a816df4454c8ecf233dee141dafaa70aaf28eb681fbb44d2aa60ac4efe015" Mar 20 08:05:17 crc kubenswrapper[4749]: E0320 08:05:17.178561 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 08:05:20 crc kubenswrapper[4749]: I0320 08:05:20.177466 4749 scope.go:117] "RemoveContainer" containerID="ea0896570df2e5e44ead081d77eb8ae850d278e89ed0c72cc980fc1dfdd27382" Mar 20 08:05:20 crc kubenswrapper[4749]: E0320 08:05:20.178058 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 08:05:20 crc kubenswrapper[4749]: I0320 08:05:20.178269 4749 scope.go:117] "RemoveContainer" containerID="9046ebdbe872cb7e98b516adea42a9b9e20a0a9303575f764937ff5f20a07ca8" Mar 20 08:05:20 crc kubenswrapper[4749]: E0320 08:05:20.178675 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 08:05:31 crc kubenswrapper[4749]: I0320 08:05:31.177986 4749 scope.go:117] "RemoveContainer" containerID="621a816df4454c8ecf233dee141dafaa70aaf28eb681fbb44d2aa60ac4efe015" Mar 20 08:05:31 crc kubenswrapper[4749]: I0320 08:05:31.178830 4749 scope.go:117] "RemoveContainer" containerID="9046ebdbe872cb7e98b516adea42a9b9e20a0a9303575f764937ff5f20a07ca8" Mar 20 08:05:31 crc kubenswrapper[4749]: E0320 08:05:31.178954 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 08:05:31 crc kubenswrapper[4749]: E0320 08:05:31.179363 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 08:05:35 crc kubenswrapper[4749]: I0320 08:05:35.177316 4749 scope.go:117] "RemoveContainer" containerID="ea0896570df2e5e44ead081d77eb8ae850d278e89ed0c72cc980fc1dfdd27382" Mar 20 08:05:35 crc kubenswrapper[4749]: E0320 08:05:35.178164 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 08:05:42 crc kubenswrapper[4749]: I0320 08:05:42.177925 4749 scope.go:117] "RemoveContainer" containerID="9046ebdbe872cb7e98b516adea42a9b9e20a0a9303575f764937ff5f20a07ca8" Mar 20 08:05:42 crc kubenswrapper[4749]: E0320 08:05:42.179044 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 08:05:43 crc kubenswrapper[4749]: I0320 08:05:43.179020 4749 scope.go:117] "RemoveContainer" containerID="621a816df4454c8ecf233dee141dafaa70aaf28eb681fbb44d2aa60ac4efe015" Mar 20 08:05:43 crc kubenswrapper[4749]: E0320 08:05:43.179425 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 08:05:48 crc kubenswrapper[4749]: I0320 08:05:48.177733 4749 scope.go:117] "RemoveContainer" containerID="ea0896570df2e5e44ead081d77eb8ae850d278e89ed0c72cc980fc1dfdd27382" Mar 20 08:05:48 crc kubenswrapper[4749]: E0320 08:05:48.178904 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 08:05:57 crc kubenswrapper[4749]: I0320 08:05:57.177180 4749 scope.go:117] "RemoveContainer" containerID="9046ebdbe872cb7e98b516adea42a9b9e20a0a9303575f764937ff5f20a07ca8" Mar 20 08:05:57 crc kubenswrapper[4749]: I0320 08:05:57.177845 4749 scope.go:117] "RemoveContainer" containerID="621a816df4454c8ecf233dee141dafaa70aaf28eb681fbb44d2aa60ac4efe015" Mar 20 08:05:57 crc kubenswrapper[4749]: E0320 08:05:57.178026 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" 
podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 08:05:57 crc kubenswrapper[4749]: E0320 08:05:57.178091 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 08:06:00 crc kubenswrapper[4749]: I0320 08:06:00.170870 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566566-ttnb7"] Mar 20 08:06:00 crc kubenswrapper[4749]: E0320 08:06:00.171852 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94272621-96be-4886-b86e-0d70244122c6" containerName="oc" Mar 20 08:06:00 crc kubenswrapper[4749]: I0320 08:06:00.171875 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="94272621-96be-4886-b86e-0d70244122c6" containerName="oc" Mar 20 08:06:00 crc kubenswrapper[4749]: I0320 08:06:00.172177 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="94272621-96be-4886-b86e-0d70244122c6" containerName="oc" Mar 20 08:06:00 crc kubenswrapper[4749]: I0320 08:06:00.173047 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566566-ttnb7" Mar 20 08:06:00 crc kubenswrapper[4749]: I0320 08:06:00.176063 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6rhdf" Mar 20 08:06:00 crc kubenswrapper[4749]: I0320 08:06:00.178527 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 08:06:00 crc kubenswrapper[4749]: I0320 08:06:00.178847 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 08:06:00 crc kubenswrapper[4749]: I0320 08:06:00.204473 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566566-ttnb7"] Mar 20 08:06:00 crc kubenswrapper[4749]: I0320 08:06:00.306222 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8w86\" (UniqueName: \"kubernetes.io/projected/41b20e18-d460-4ff6-a1d4-a08137d44b8d-kube-api-access-t8w86\") pod \"auto-csr-approver-29566566-ttnb7\" (UID: \"41b20e18-d460-4ff6-a1d4-a08137d44b8d\") " pod="openshift-infra/auto-csr-approver-29566566-ttnb7" Mar 20 08:06:00 crc kubenswrapper[4749]: I0320 08:06:00.409329 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8w86\" (UniqueName: \"kubernetes.io/projected/41b20e18-d460-4ff6-a1d4-a08137d44b8d-kube-api-access-t8w86\") pod \"auto-csr-approver-29566566-ttnb7\" (UID: \"41b20e18-d460-4ff6-a1d4-a08137d44b8d\") " pod="openshift-infra/auto-csr-approver-29566566-ttnb7" Mar 20 08:06:00 crc kubenswrapper[4749]: I0320 08:06:00.429345 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8w86\" (UniqueName: \"kubernetes.io/projected/41b20e18-d460-4ff6-a1d4-a08137d44b8d-kube-api-access-t8w86\") pod \"auto-csr-approver-29566566-ttnb7\" (UID: \"41b20e18-d460-4ff6-a1d4-a08137d44b8d\") " pod="openshift-infra/auto-csr-approver-29566566-ttnb7" Mar 20 08:06:00 crc kubenswrapper[4749]: I0320 08:06:00.500868 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566566-ttnb7" Mar 20 08:06:01 crc kubenswrapper[4749]: I0320 08:06:01.001538 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566566-ttnb7"] Mar 20 08:06:01 crc kubenswrapper[4749]: W0320 08:06:01.028797 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41b20e18_d460_4ff6_a1d4_a08137d44b8d.slice/crio-43fc1e978d6b61b19d7da3a16473844cb41848ef2e9819bdd3742decf7ddf96d WatchSource:0}: Error finding container 43fc1e978d6b61b19d7da3a16473844cb41848ef2e9819bdd3742decf7ddf96d: Status 404 returned error can't find the container with id 43fc1e978d6b61b19d7da3a16473844cb41848ef2e9819bdd3742decf7ddf96d Mar 20 08:06:01 crc kubenswrapper[4749]: I0320 08:06:01.108207 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566566-ttnb7" event={"ID":"41b20e18-d460-4ff6-a1d4-a08137d44b8d","Type":"ContainerStarted","Data":"43fc1e978d6b61b19d7da3a16473844cb41848ef2e9819bdd3742decf7ddf96d"} Mar 20 08:06:02 crc kubenswrapper[4749]: I0320 08:06:02.178124 4749 scope.go:117] "RemoveContainer" containerID="ea0896570df2e5e44ead081d77eb8ae850d278e89ed0c72cc980fc1dfdd27382" Mar 20 08:06:02 crc kubenswrapper[4749]: E0320 08:06:02.178481 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 08:06:03 crc kubenswrapper[4749]: I0320 08:06:03.129605 4749 generic.go:334] "Generic (PLEG): container finished" podID="41b20e18-d460-4ff6-a1d4-a08137d44b8d" containerID="221feee4104e0b0b29325167b698b444d5bfbf940e3f5078715773badc6d1ab8" exitCode=0 Mar 20 08:06:03 crc kubenswrapper[4749]: I0320 08:06:03.129660 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566566-ttnb7" event={"ID":"41b20e18-d460-4ff6-a1d4-a08137d44b8d","Type":"ContainerDied","Data":"221feee4104e0b0b29325167b698b444d5bfbf940e3f5078715773badc6d1ab8"} Mar 20 08:06:04 crc kubenswrapper[4749]: I0320 08:06:04.556754 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566566-ttnb7" Mar 20 08:06:04 crc kubenswrapper[4749]: I0320 08:06:04.704744 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8w86\" (UniqueName: \"kubernetes.io/projected/41b20e18-d460-4ff6-a1d4-a08137d44b8d-kube-api-access-t8w86\") pod \"41b20e18-d460-4ff6-a1d4-a08137d44b8d\" (UID: \"41b20e18-d460-4ff6-a1d4-a08137d44b8d\") " Mar 20 08:06:04 crc kubenswrapper[4749]: I0320 08:06:04.732104 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41b20e18-d460-4ff6-a1d4-a08137d44b8d-kube-api-access-t8w86" (OuterVolumeSpecName: "kube-api-access-t8w86") pod "41b20e18-d460-4ff6-a1d4-a08137d44b8d" (UID: "41b20e18-d460-4ff6-a1d4-a08137d44b8d"). InnerVolumeSpecName "kube-api-access-t8w86". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:06:04 crc kubenswrapper[4749]: I0320 08:06:04.806994 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8w86\" (UniqueName: \"kubernetes.io/projected/41b20e18-d460-4ff6-a1d4-a08137d44b8d-kube-api-access-t8w86\") on node \"crc\" DevicePath \"\"" Mar 20 08:06:05 crc kubenswrapper[4749]: I0320 08:06:05.153738 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566566-ttnb7" event={"ID":"41b20e18-d460-4ff6-a1d4-a08137d44b8d","Type":"ContainerDied","Data":"43fc1e978d6b61b19d7da3a16473844cb41848ef2e9819bdd3742decf7ddf96d"} Mar 20 08:06:05 crc kubenswrapper[4749]: I0320 08:06:05.154092 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43fc1e978d6b61b19d7da3a16473844cb41848ef2e9819bdd3742decf7ddf96d" Mar 20 08:06:05 crc kubenswrapper[4749]: I0320 08:06:05.153846 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566566-ttnb7" Mar 20 08:06:05 crc kubenswrapper[4749]: I0320 08:06:05.653245 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566560-sn5mp"] Mar 20 08:06:05 crc kubenswrapper[4749]: I0320 08:06:05.664346 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566560-sn5mp"] Mar 20 08:06:06 crc kubenswrapper[4749]: I0320 08:06:06.223462 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6b9b776-816a-4d03-9d2a-1933104588ad" path="/var/lib/kubelet/pods/a6b9b776-816a-4d03-9d2a-1933104588ad/volumes" Mar 20 08:06:09 crc kubenswrapper[4749]: I0320 08:06:09.177524 4749 scope.go:117] "RemoveContainer" containerID="621a816df4454c8ecf233dee141dafaa70aaf28eb681fbb44d2aa60ac4efe015" Mar 20 08:06:09 crc kubenswrapper[4749]: E0320 08:06:09.178343 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 08:06:11 crc kubenswrapper[4749]: I0320 08:06:11.178002 4749 scope.go:117] "RemoveContainer" containerID="9046ebdbe872cb7e98b516adea42a9b9e20a0a9303575f764937ff5f20a07ca8" Mar 20 08:06:11 crc kubenswrapper[4749]: E0320 08:06:11.178835 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 08:06:14 crc kubenswrapper[4749]: I0320 08:06:14.195189 4749 scope.go:117] "RemoveContainer" containerID="ea0896570df2e5e44ead081d77eb8ae850d278e89ed0c72cc980fc1dfdd27382" Mar 20 08:06:14 crc kubenswrapper[4749]: E0320 08:06:14.196540 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 08:06:24 crc 
kubenswrapper[4749]: I0320 08:06:24.183460 4749 scope.go:117] "RemoveContainer" containerID="621a816df4454c8ecf233dee141dafaa70aaf28eb681fbb44d2aa60ac4efe015" Mar 20 08:06:24 crc kubenswrapper[4749]: E0320 08:06:24.184424 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 08:06:25 crc kubenswrapper[4749]: I0320 08:06:25.176899 4749 scope.go:117] "RemoveContainer" containerID="9046ebdbe872cb7e98b516adea42a9b9e20a0a9303575f764937ff5f20a07ca8" Mar 20 08:06:25 crc kubenswrapper[4749]: E0320 08:06:25.177299 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 08:06:26 crc kubenswrapper[4749]: I0320 08:06:26.354811 4749 generic.go:334] "Generic (PLEG): container finished" podID="15b16cd7-4e64-4fb8-a2e2-8fc7f9e23fbf" containerID="a2730979fad73f1bc2efb5785221f9f32dacbfbb44fbf44ddbad8d07d8d7de51" exitCode=0 Mar 20 08:06:26 crc kubenswrapper[4749]: I0320 08:06:26.354866 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h9vbt/must-gather-5qg26" event={"ID":"15b16cd7-4e64-4fb8-a2e2-8fc7f9e23fbf","Type":"ContainerDied","Data":"a2730979fad73f1bc2efb5785221f9f32dacbfbb44fbf44ddbad8d07d8d7de51"} Mar 20 08:06:26 crc kubenswrapper[4749]: I0320 08:06:26.355728 4749 scope.go:117] "RemoveContainer" containerID="a2730979fad73f1bc2efb5785221f9f32dacbfbb44fbf44ddbad8d07d8d7de51" Mar 20 08:06:26 crc kubenswrapper[4749]: I0320 08:06:26.501977 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-h9vbt_must-gather-5qg26_15b16cd7-4e64-4fb8-a2e2-8fc7f9e23fbf/gather/0.log" Mar 20 08:06:28 crc kubenswrapper[4749]: I0320 08:06:28.176920 4749 scope.go:117] "RemoveContainer" containerID="ea0896570df2e5e44ead081d77eb8ae850d278e89ed0c72cc980fc1dfdd27382" Mar 20 08:06:28 crc kubenswrapper[4749]: E0320 08:06:28.177565 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxqfd_openshift-machine-config-operator(12151228-1cb9-4086-9a62-f4a9583f5f69)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" Mar 20 08:06:34 crc kubenswrapper[4749]: I0320 08:06:34.635208 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-h9vbt/must-gather-5qg26"] Mar 20 08:06:34 crc kubenswrapper[4749]: I0320 08:06:34.636011 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-h9vbt/must-gather-5qg26" podUID="15b16cd7-4e64-4fb8-a2e2-8fc7f9e23fbf" containerName="copy" containerID="cri-o://e6158cb2d59044f0ab0a1c77b1970922ce4ef675a575bc19c8cbbbbccf00ce8b" gracePeriod=2 Mar 20 08:06:34 crc kubenswrapper[4749]: I0320 08:06:34.642980 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-h9vbt/must-gather-5qg26"] Mar 20 08:06:35 crc kubenswrapper[4749]: I0320 
08:06:35.099001 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-h9vbt_must-gather-5qg26_15b16cd7-4e64-4fb8-a2e2-8fc7f9e23fbf/copy/0.log" Mar 20 08:06:35 crc kubenswrapper[4749]: I0320 08:06:35.100637 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h9vbt/must-gather-5qg26" Mar 20 08:06:35 crc kubenswrapper[4749]: I0320 08:06:35.287788 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/15b16cd7-4e64-4fb8-a2e2-8fc7f9e23fbf-must-gather-output\") pod \"15b16cd7-4e64-4fb8-a2e2-8fc7f9e23fbf\" (UID: \"15b16cd7-4e64-4fb8-a2e2-8fc7f9e23fbf\") " Mar 20 08:06:35 crc kubenswrapper[4749]: I0320 08:06:35.288158 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-br9vk\" (UniqueName: \"kubernetes.io/projected/15b16cd7-4e64-4fb8-a2e2-8fc7f9e23fbf-kube-api-access-br9vk\") pod \"15b16cd7-4e64-4fb8-a2e2-8fc7f9e23fbf\" (UID: \"15b16cd7-4e64-4fb8-a2e2-8fc7f9e23fbf\") " Mar 20 08:06:35 crc kubenswrapper[4749]: I0320 08:06:35.295865 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15b16cd7-4e64-4fb8-a2e2-8fc7f9e23fbf-kube-api-access-br9vk" (OuterVolumeSpecName: "kube-api-access-br9vk") pod "15b16cd7-4e64-4fb8-a2e2-8fc7f9e23fbf" (UID: "15b16cd7-4e64-4fb8-a2e2-8fc7f9e23fbf"). InnerVolumeSpecName "kube-api-access-br9vk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:06:35 crc kubenswrapper[4749]: I0320 08:06:35.389911 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-br9vk\" (UniqueName: \"kubernetes.io/projected/15b16cd7-4e64-4fb8-a2e2-8fc7f9e23fbf-kube-api-access-br9vk\") on node \"crc\" DevicePath \"\"" Mar 20 08:06:35 crc kubenswrapper[4749]: I0320 08:06:35.418514 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15b16cd7-4e64-4fb8-a2e2-8fc7f9e23fbf-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "15b16cd7-4e64-4fb8-a2e2-8fc7f9e23fbf" (UID: "15b16cd7-4e64-4fb8-a2e2-8fc7f9e23fbf"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:06:35 crc kubenswrapper[4749]: I0320 08:06:35.450233 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-h9vbt_must-gather-5qg26_15b16cd7-4e64-4fb8-a2e2-8fc7f9e23fbf/copy/0.log" Mar 20 08:06:35 crc kubenswrapper[4749]: I0320 08:06:35.450586 4749 generic.go:334] "Generic (PLEG): container finished" podID="15b16cd7-4e64-4fb8-a2e2-8fc7f9e23fbf" containerID="e6158cb2d59044f0ab0a1c77b1970922ce4ef675a575bc19c8cbbbbccf00ce8b" exitCode=143 Mar 20 08:06:35 crc kubenswrapper[4749]: I0320 08:06:35.450640 4749 scope.go:117] "RemoveContainer" containerID="e6158cb2d59044f0ab0a1c77b1970922ce4ef675a575bc19c8cbbbbccf00ce8b" Mar 20 08:06:35 crc kubenswrapper[4749]: I0320 08:06:35.450787 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-h9vbt/must-gather-5qg26" Mar 20 08:06:35 crc kubenswrapper[4749]: I0320 08:06:35.481996 4749 scope.go:117] "RemoveContainer" containerID="a2730979fad73f1bc2efb5785221f9f32dacbfbb44fbf44ddbad8d07d8d7de51" Mar 20 08:06:35 crc kubenswrapper[4749]: I0320 08:06:35.491052 4749 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/15b16cd7-4e64-4fb8-a2e2-8fc7f9e23fbf-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 20 08:06:35 crc kubenswrapper[4749]: I0320 08:06:35.550880 4749 scope.go:117] "RemoveContainer" containerID="e6158cb2d59044f0ab0a1c77b1970922ce4ef675a575bc19c8cbbbbccf00ce8b" Mar 20 08:06:35 crc kubenswrapper[4749]: E0320 08:06:35.553122 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6158cb2d59044f0ab0a1c77b1970922ce4ef675a575bc19c8cbbbbccf00ce8b\": container with ID starting with e6158cb2d59044f0ab0a1c77b1970922ce4ef675a575bc19c8cbbbbccf00ce8b not found: ID does not exist" containerID="e6158cb2d59044f0ab0a1c77b1970922ce4ef675a575bc19c8cbbbbccf00ce8b" Mar 20 08:06:35 crc kubenswrapper[4749]: I0320 08:06:35.553174 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6158cb2d59044f0ab0a1c77b1970922ce4ef675a575bc19c8cbbbbccf00ce8b"} err="failed to get container status \"e6158cb2d59044f0ab0a1c77b1970922ce4ef675a575bc19c8cbbbbccf00ce8b\": rpc error: code = NotFound desc = could not find container \"e6158cb2d59044f0ab0a1c77b1970922ce4ef675a575bc19c8cbbbbccf00ce8b\": container with ID starting with e6158cb2d59044f0ab0a1c77b1970922ce4ef675a575bc19c8cbbbbccf00ce8b not found: ID does not exist" Mar 20 08:06:35 crc kubenswrapper[4749]: I0320 08:06:35.553200 4749 scope.go:117] "RemoveContainer" containerID="a2730979fad73f1bc2efb5785221f9f32dacbfbb44fbf44ddbad8d07d8d7de51" Mar 20 08:06:35 crc kubenswrapper[4749]: E0320 08:06:35.554441 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2730979fad73f1bc2efb5785221f9f32dacbfbb44fbf44ddbad8d07d8d7de51\": container with ID starting with a2730979fad73f1bc2efb5785221f9f32dacbfbb44fbf44ddbad8d07d8d7de51 not found: ID does not exist" containerID="a2730979fad73f1bc2efb5785221f9f32dacbfbb44fbf44ddbad8d07d8d7de51" Mar 20 08:06:35 crc kubenswrapper[4749]: I0320 08:06:35.554479 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2730979fad73f1bc2efb5785221f9f32dacbfbb44fbf44ddbad8d07d8d7de51"} err="failed to get container status \"a2730979fad73f1bc2efb5785221f9f32dacbfbb44fbf44ddbad8d07d8d7de51\": rpc error: code = NotFound desc = could not find container \"a2730979fad73f1bc2efb5785221f9f32dacbfbb44fbf44ddbad8d07d8d7de51\": container with ID starting with a2730979fad73f1bc2efb5785221f9f32dacbfbb44fbf44ddbad8d07d8d7de51 not found: ID does not exist" Mar 20 08:06:35 crc kubenswrapper[4749]: E0320 08:06:35.634086 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15b16cd7_4e64_4fb8_a2e2_8fc7f9e23fbf.slice/crio-03e9e599ffcf1127b748f63ba43b277c87916bd5e57ab4901149cad7045770d4\": RecentStats: unable to find data in memory cache]" Mar 20 08:06:36 crc kubenswrapper[4749]: I0320 08:06:36.190189 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="15b16cd7-4e64-4fb8-a2e2-8fc7f9e23fbf" path="/var/lib/kubelet/pods/15b16cd7-4e64-4fb8-a2e2-8fc7f9e23fbf/volumes" Mar 20 08:06:39 crc kubenswrapper[4749]: I0320 08:06:39.177526 4749 scope.go:117] "RemoveContainer" containerID="621a816df4454c8ecf233dee141dafaa70aaf28eb681fbb44d2aa60ac4efe015" Mar 20 08:06:39 crc kubenswrapper[4749]: I0320 08:06:39.178073 4749 scope.go:117] "RemoveContainer" containerID="9046ebdbe872cb7e98b516adea42a9b9e20a0a9303575f764937ff5f20a07ca8" Mar 20 08:06:39 crc kubenswrapper[4749]: E0320 08:06:39.178071 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 08:06:39 crc kubenswrapper[4749]: E0320 08:06:39.178248 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 08:06:40 crc kubenswrapper[4749]: I0320 08:06:40.178039 4749 scope.go:117] "RemoveContainer" containerID="ea0896570df2e5e44ead081d77eb8ae850d278e89ed0c72cc980fc1dfdd27382" Mar 20 08:06:40 crc kubenswrapper[4749]: I0320 08:06:40.497825 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" event={"ID":"12151228-1cb9-4086-9a62-f4a9583f5f69","Type":"ContainerStarted","Data":"a258c4f4d585e078379b3e5ef0c2e00add11dfb747d3608bdf2359c1caecf14a"} Mar 20 08:06:48 crc kubenswrapper[4749]: I0320 08:06:48.536337 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sg7kr"] Mar 20 08:06:48 crc kubenswrapper[4749]: E0320 08:06:48.539550 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15b16cd7-4e64-4fb8-a2e2-8fc7f9e23fbf" containerName="gather" Mar 20 08:06:48 crc kubenswrapper[4749]: I0320 08:06:48.539748 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="15b16cd7-4e64-4fb8-a2e2-8fc7f9e23fbf" containerName="gather" Mar 20 08:06:48 crc kubenswrapper[4749]: E0320 08:06:48.539923 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41b20e18-d460-4ff6-a1d4-a08137d44b8d" containerName="oc" Mar 20 08:06:48 crc kubenswrapper[4749]: I0320 08:06:48.540102 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="41b20e18-d460-4ff6-a1d4-a08137d44b8d" containerName="oc" Mar 20 08:06:48 crc kubenswrapper[4749]: E0320 08:06:48.540368 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15b16cd7-4e64-4fb8-a2e2-8fc7f9e23fbf" containerName="copy" Mar 20 08:06:48 crc kubenswrapper[4749]: I0320 08:06:48.540553 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="15b16cd7-4e64-4fb8-a2e2-8fc7f9e23fbf" containerName="copy" Mar 20 08:06:48 crc kubenswrapper[4749]: I0320 08:06:48.541069 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="41b20e18-d460-4ff6-a1d4-a08137d44b8d" containerName="oc" Mar 20 08:06:48 crc kubenswrapper[4749]: I0320 08:06:48.541265 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="15b16cd7-4e64-4fb8-a2e2-8fc7f9e23fbf" containerName="copy" Mar 20 08:06:48 crc kubenswrapper[4749]: I0320 08:06:48.541516 4749 
memory_manager.go:354] "RemoveStaleState removing state" podUID="15b16cd7-4e64-4fb8-a2e2-8fc7f9e23fbf" containerName="gather" Mar 20 08:06:48 crc kubenswrapper[4749]: I0320 08:06:48.544792 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sg7kr" Mar 20 08:06:48 crc kubenswrapper[4749]: I0320 08:06:48.561193 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sg7kr"] Mar 20 08:06:48 crc kubenswrapper[4749]: I0320 08:06:48.633496 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0c2a76b-a0ed-44f5-9709-b297d7c360e9-catalog-content\") pod \"redhat-marketplace-sg7kr\" (UID: \"d0c2a76b-a0ed-44f5-9709-b297d7c360e9\") " pod="openshift-marketplace/redhat-marketplace-sg7kr" Mar 20 08:06:48 crc kubenswrapper[4749]: I0320 08:06:48.633935 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cxjp\" (UniqueName: \"kubernetes.io/projected/d0c2a76b-a0ed-44f5-9709-b297d7c360e9-kube-api-access-2cxjp\") pod \"redhat-marketplace-sg7kr\" (UID: \"d0c2a76b-a0ed-44f5-9709-b297d7c360e9\") " pod="openshift-marketplace/redhat-marketplace-sg7kr" Mar 20 08:06:48 crc kubenswrapper[4749]: I0320 08:06:48.634052 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0c2a76b-a0ed-44f5-9709-b297d7c360e9-utilities\") pod \"redhat-marketplace-sg7kr\" (UID: \"d0c2a76b-a0ed-44f5-9709-b297d7c360e9\") " pod="openshift-marketplace/redhat-marketplace-sg7kr" Mar 20 08:06:48 crc kubenswrapper[4749]: I0320 08:06:48.735669 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0c2a76b-a0ed-44f5-9709-b297d7c360e9-catalog-content\") pod \"redhat-marketplace-sg7kr\" (UID: \"d0c2a76b-a0ed-44f5-9709-b297d7c360e9\") " pod="openshift-marketplace/redhat-marketplace-sg7kr" Mar 20 08:06:48 crc kubenswrapper[4749]: I0320 08:06:48.735955 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cxjp\" (UniqueName: \"kubernetes.io/projected/d0c2a76b-a0ed-44f5-9709-b297d7c360e9-kube-api-access-2cxjp\") pod \"redhat-marketplace-sg7kr\" (UID: \"d0c2a76b-a0ed-44f5-9709-b297d7c360e9\") " pod="openshift-marketplace/redhat-marketplace-sg7kr" Mar 20 08:06:48 crc kubenswrapper[4749]: I0320 08:06:48.736023 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0c2a76b-a0ed-44f5-9709-b297d7c360e9-utilities\") pod \"redhat-marketplace-sg7kr\" (UID: \"d0c2a76b-a0ed-44f5-9709-b297d7c360e9\") " pod="openshift-marketplace/redhat-marketplace-sg7kr" Mar 20 08:06:48 crc kubenswrapper[4749]: I0320 08:06:48.736678 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0c2a76b-a0ed-44f5-9709-b297d7c360e9-utilities\") pod \"redhat-marketplace-sg7kr\" (UID: \"d0c2a76b-a0ed-44f5-9709-b297d7c360e9\") " pod="openshift-marketplace/redhat-marketplace-sg7kr" Mar 20 08:06:48 crc kubenswrapper[4749]: I0320 08:06:48.736819 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0c2a76b-a0ed-44f5-9709-b297d7c360e9-catalog-content\") pod 
\"redhat-marketplace-sg7kr\" (UID: \"d0c2a76b-a0ed-44f5-9709-b297d7c360e9\") " pod="openshift-marketplace/redhat-marketplace-sg7kr" Mar 20 08:06:48 crc kubenswrapper[4749]: I0320 08:06:48.768100 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cxjp\" (UniqueName: \"kubernetes.io/projected/d0c2a76b-a0ed-44f5-9709-b297d7c360e9-kube-api-access-2cxjp\") pod \"redhat-marketplace-sg7kr\" (UID: \"d0c2a76b-a0ed-44f5-9709-b297d7c360e9\") " pod="openshift-marketplace/redhat-marketplace-sg7kr" Mar 20 08:06:48 crc kubenswrapper[4749]: I0320 08:06:48.887990 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sg7kr" Mar 20 08:06:49 crc kubenswrapper[4749]: I0320 08:06:49.354858 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sg7kr"] Mar 20 08:06:49 crc kubenswrapper[4749]: I0320 08:06:49.584563 4749 generic.go:334] "Generic (PLEG): container finished" podID="d0c2a76b-a0ed-44f5-9709-b297d7c360e9" containerID="4b1ebac3ab8b3aea1343ba749e04db87aab683eac9cdf078ddfe75ee95522868" exitCode=0 Mar 20 08:06:49 crc kubenswrapper[4749]: I0320 08:06:49.584609 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sg7kr" event={"ID":"d0c2a76b-a0ed-44f5-9709-b297d7c360e9","Type":"ContainerDied","Data":"4b1ebac3ab8b3aea1343ba749e04db87aab683eac9cdf078ddfe75ee95522868"} Mar 20 08:06:49 crc kubenswrapper[4749]: I0320 08:06:49.584634 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sg7kr" event={"ID":"d0c2a76b-a0ed-44f5-9709-b297d7c360e9","Type":"ContainerStarted","Data":"de36f73a220f1670ff33f31c6644588dbebcc02ca0645655d4ccc61ee8af046b"} Mar 20 08:06:50 crc kubenswrapper[4749]: I0320 08:06:50.342706 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-92ftr"] Mar 20 08:06:50 crc kubenswrapper[4749]: I0320 08:06:50.345935 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-92ftr" Mar 20 08:06:50 crc kubenswrapper[4749]: I0320 08:06:50.352632 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-92ftr"] Mar 20 08:06:50 crc kubenswrapper[4749]: I0320 08:06:50.376046 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16f9d5d7-ffac-44f1-9092-22181265525d-utilities\") pod \"redhat-operators-92ftr\" (UID: \"16f9d5d7-ffac-44f1-9092-22181265525d\") " pod="openshift-marketplace/redhat-operators-92ftr" Mar 20 08:06:50 crc kubenswrapper[4749]: I0320 08:06:50.376427 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16f9d5d7-ffac-44f1-9092-22181265525d-catalog-content\") pod \"redhat-operators-92ftr\" (UID: \"16f9d5d7-ffac-44f1-9092-22181265525d\") " pod="openshift-marketplace/redhat-operators-92ftr" Mar 20 08:06:50 crc kubenswrapper[4749]: I0320 08:06:50.376745 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbwhs\" (UniqueName: \"kubernetes.io/projected/16f9d5d7-ffac-44f1-9092-22181265525d-kube-api-access-qbwhs\") pod \"redhat-operators-92ftr\" (UID: \"16f9d5d7-ffac-44f1-9092-22181265525d\") " pod="openshift-marketplace/redhat-operators-92ftr" Mar 20 08:06:50 crc kubenswrapper[4749]: I0320 08:06:50.478579 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16f9d5d7-ffac-44f1-9092-22181265525d-utilities\") pod \"redhat-operators-92ftr\" (UID: \"16f9d5d7-ffac-44f1-9092-22181265525d\") " pod="openshift-marketplace/redhat-operators-92ftr" Mar 20 08:06:50 crc kubenswrapper[4749]: I0320 08:06:50.478628 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16f9d5d7-ffac-44f1-9092-22181265525d-catalog-content\") pod \"redhat-operators-92ftr\" (UID: \"16f9d5d7-ffac-44f1-9092-22181265525d\") " pod="openshift-marketplace/redhat-operators-92ftr" Mar 20 08:06:50 crc kubenswrapper[4749]: I0320 08:06:50.478673 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbwhs\" (UniqueName: \"kubernetes.io/projected/16f9d5d7-ffac-44f1-9092-22181265525d-kube-api-access-qbwhs\") pod \"redhat-operators-92ftr\" (UID: \"16f9d5d7-ffac-44f1-9092-22181265525d\") " pod="openshift-marketplace/redhat-operators-92ftr" Mar 20 08:06:50 crc kubenswrapper[4749]: I0320 08:06:50.479275 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16f9d5d7-ffac-44f1-9092-22181265525d-utilities\") pod \"redhat-operators-92ftr\" (UID: \"16f9d5d7-ffac-44f1-9092-22181265525d\") " pod="openshift-marketplace/redhat-operators-92ftr" Mar 20 08:06:50 crc kubenswrapper[4749]: I0320 08:06:50.479501 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16f9d5d7-ffac-44f1-9092-22181265525d-catalog-content\") pod \"redhat-operators-92ftr\" (UID: \"16f9d5d7-ffac-44f1-9092-22181265525d\") " pod="openshift-marketplace/redhat-operators-92ftr" Mar 20 08:06:50 crc kubenswrapper[4749]: I0320 08:06:50.500399 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-qbwhs\" (UniqueName: \"kubernetes.io/projected/16f9d5d7-ffac-44f1-9092-22181265525d-kube-api-access-qbwhs\") pod \"redhat-operators-92ftr\" (UID: \"16f9d5d7-ffac-44f1-9092-22181265525d\") " pod="openshift-marketplace/redhat-operators-92ftr" Mar 20 08:06:50 crc kubenswrapper[4749]: I0320 08:06:50.593275 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sg7kr" event={"ID":"d0c2a76b-a0ed-44f5-9709-b297d7c360e9","Type":"ContainerStarted","Data":"ae0db7258e1d55666458b96d5c114ea1b133ad63dfc3fc86f5d7a75ef51cdbe5"} Mar 20 08:06:50 crc kubenswrapper[4749]: I0320 08:06:50.665189 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-92ftr" Mar 20 08:06:51 crc kubenswrapper[4749]: I0320 08:06:51.141917 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-92ftr"] Mar 20 08:06:51 crc kubenswrapper[4749]: W0320 08:06:51.159096 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16f9d5d7_ffac_44f1_9092_22181265525d.slice/crio-149d82f61fbd5f1a06f7d6d39459346c27603013c2e1fba85d0157a4427dbfe6 WatchSource:0}: Error finding container 149d82f61fbd5f1a06f7d6d39459346c27603013c2e1fba85d0157a4427dbfe6: Status 404 returned error can't find the container with id 149d82f61fbd5f1a06f7d6d39459346c27603013c2e1fba85d0157a4427dbfe6 Mar 20 08:06:51 crc kubenswrapper[4749]: I0320 08:06:51.602519 4749 generic.go:334] "Generic (PLEG): container finished" podID="d0c2a76b-a0ed-44f5-9709-b297d7c360e9" containerID="ae0db7258e1d55666458b96d5c114ea1b133ad63dfc3fc86f5d7a75ef51cdbe5" exitCode=0 Mar 20 08:06:51 crc kubenswrapper[4749]: I0320 08:06:51.602569 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sg7kr" event={"ID":"d0c2a76b-a0ed-44f5-9709-b297d7c360e9","Type":"ContainerDied","Data":"ae0db7258e1d55666458b96d5c114ea1b133ad63dfc3fc86f5d7a75ef51cdbe5"} Mar 20 08:06:51 crc kubenswrapper[4749]: I0320 08:06:51.604265 4749 generic.go:334] "Generic (PLEG): container finished" podID="16f9d5d7-ffac-44f1-9092-22181265525d" containerID="ca3f8c880cc122833df09a622f0c0654ee5d63dcc95cb743ee8e5692be0f88a6" exitCode=0 Mar 20 08:06:51 crc kubenswrapper[4749]: I0320 08:06:51.604310 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-92ftr" event={"ID":"16f9d5d7-ffac-44f1-9092-22181265525d","Type":"ContainerDied","Data":"ca3f8c880cc122833df09a622f0c0654ee5d63dcc95cb743ee8e5692be0f88a6"} Mar 20 08:06:51 crc kubenswrapper[4749]: I0320 08:06:51.604341 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-92ftr" event={"ID":"16f9d5d7-ffac-44f1-9092-22181265525d","Type":"ContainerStarted","Data":"149d82f61fbd5f1a06f7d6d39459346c27603013c2e1fba85d0157a4427dbfe6"} Mar 20 08:06:52 crc kubenswrapper[4749]: I0320 08:06:52.613841 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sg7kr" event={"ID":"d0c2a76b-a0ed-44f5-9709-b297d7c360e9","Type":"ContainerStarted","Data":"dc4a80cda36af7aed56badd7f36024cbaf05d8f932cfa7857fd1ced9c6a8a059"} Mar 20 08:06:52 crc kubenswrapper[4749]: I0320 08:06:52.616149 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-92ftr" 
event={"ID":"16f9d5d7-ffac-44f1-9092-22181265525d","Type":"ContainerStarted","Data":"8abdf51f99f83da29853da59fd1417e5c55543d6de3712ea6f7cc3a888a28328"} Mar 20 08:06:52 crc kubenswrapper[4749]: I0320 08:06:52.631416 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sg7kr" podStartSLOduration=2.211029725 podStartE2EDuration="4.631397432s" podCreationTimestamp="2026-03-20 08:06:48 +0000 UTC" firstStartedPulling="2026-03-20 08:06:49.586649153 +0000 UTC m=+3246.136306800" lastFinishedPulling="2026-03-20 08:06:52.00701682 +0000 UTC m=+3248.556674507" observedRunningTime="2026-03-20 08:06:52.63052051 +0000 UTC m=+3249.180178167" watchObservedRunningTime="2026-03-20 08:06:52.631397432 +0000 UTC m=+3249.181055089" Mar 20 08:06:53 crc kubenswrapper[4749]: I0320 08:06:53.625025 4749 generic.go:334] "Generic (PLEG): container finished" podID="16f9d5d7-ffac-44f1-9092-22181265525d" containerID="8abdf51f99f83da29853da59fd1417e5c55543d6de3712ea6f7cc3a888a28328" exitCode=0 Mar 20 08:06:53 crc kubenswrapper[4749]: I0320 08:06:53.625070 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-92ftr" event={"ID":"16f9d5d7-ffac-44f1-9092-22181265525d","Type":"ContainerDied","Data":"8abdf51f99f83da29853da59fd1417e5c55543d6de3712ea6f7cc3a888a28328"} Mar 20 08:06:54 crc kubenswrapper[4749]: I0320 08:06:54.187697 4749 scope.go:117] "RemoveContainer" containerID="9046ebdbe872cb7e98b516adea42a9b9e20a0a9303575f764937ff5f20a07ca8" Mar 20 08:06:54 crc kubenswrapper[4749]: I0320 08:06:54.188229 4749 scope.go:117] "RemoveContainer" containerID="621a816df4454c8ecf233dee141dafaa70aaf28eb681fbb44d2aa60ac4efe015" Mar 20 08:06:54 crc kubenswrapper[4749]: E0320 08:06:54.188431 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 08:06:54 crc kubenswrapper[4749]: E0320 08:06:54.188553 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 08:06:54 crc kubenswrapper[4749]: I0320 08:06:54.635480 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-92ftr" event={"ID":"16f9d5d7-ffac-44f1-9092-22181265525d","Type":"ContainerStarted","Data":"dab0ca054b463fcde288401c8cc134782c8ad6dba39d9045800988e91039dddf"} Mar 20 08:06:54 crc kubenswrapper[4749]: I0320 08:06:54.655243 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-92ftr" podStartSLOduration=2.230641522 podStartE2EDuration="4.655225161s" podCreationTimestamp="2026-03-20 08:06:50 +0000 UTC" firstStartedPulling="2026-03-20 08:06:51.605769179 +0000 UTC m=+3248.155426826" lastFinishedPulling="2026-03-20 08:06:54.030352818 +0000 UTC m=+3250.580010465" observedRunningTime="2026-03-20 08:06:54.651219214 +0000 UTC m=+3251.200876861" watchObservedRunningTime="2026-03-20 08:06:54.655225161 +0000 UTC m=+3251.204882808" Mar 20 08:06:57 crc kubenswrapper[4749]: I0320 08:06:57.068486 4749 
scope.go:117] "RemoveContainer" containerID="7c166719e7009a22ece92a2695fff8bc425b2d9889214284173ce3b96d182ebd" Mar 20 08:06:58 crc kubenswrapper[4749]: I0320 08:06:58.888189 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sg7kr" Mar 20 08:06:58 crc kubenswrapper[4749]: I0320 08:06:58.888891 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sg7kr" Mar 20 08:06:58 crc kubenswrapper[4749]: I0320 08:06:58.952742 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sg7kr" Mar 20 08:06:59 crc kubenswrapper[4749]: I0320 08:06:59.741468 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sg7kr" Mar 20 08:06:59 crc kubenswrapper[4749]: I0320 08:06:59.791555 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sg7kr"] Mar 20 08:07:00 crc kubenswrapper[4749]: I0320 08:07:00.666478 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-92ftr" Mar 20 08:07:00 crc kubenswrapper[4749]: I0320 08:07:00.666544 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-92ftr" Mar 20 08:07:01 crc kubenswrapper[4749]: I0320 08:07:01.694055 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sg7kr" podUID="d0c2a76b-a0ed-44f5-9709-b297d7c360e9" containerName="registry-server" containerID="cri-o://dc4a80cda36af7aed56badd7f36024cbaf05d8f932cfa7857fd1ced9c6a8a059" gracePeriod=2 Mar 20 08:07:01 crc kubenswrapper[4749]: I0320 08:07:01.727525 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-92ftr" podUID="16f9d5d7-ffac-44f1-9092-22181265525d" containerName="registry-server" probeResult="failure" output=< Mar 20 08:07:01 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Mar 20 08:07:01 crc kubenswrapper[4749]: > Mar 20 08:07:02 crc kubenswrapper[4749]: I0320 08:07:02.164617 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sg7kr" Mar 20 08:07:02 crc kubenswrapper[4749]: I0320 08:07:02.320094 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cxjp\" (UniqueName: \"kubernetes.io/projected/d0c2a76b-a0ed-44f5-9709-b297d7c360e9-kube-api-access-2cxjp\") pod \"d0c2a76b-a0ed-44f5-9709-b297d7c360e9\" (UID: \"d0c2a76b-a0ed-44f5-9709-b297d7c360e9\") " Mar 20 08:07:02 crc kubenswrapper[4749]: I0320 08:07:02.320253 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0c2a76b-a0ed-44f5-9709-b297d7c360e9-catalog-content\") pod \"d0c2a76b-a0ed-44f5-9709-b297d7c360e9\" (UID: \"d0c2a76b-a0ed-44f5-9709-b297d7c360e9\") " Mar 20 08:07:02 crc kubenswrapper[4749]: I0320 08:07:02.320372 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0c2a76b-a0ed-44f5-9709-b297d7c360e9-utilities\") pod \"d0c2a76b-a0ed-44f5-9709-b297d7c360e9\" (UID: \"d0c2a76b-a0ed-44f5-9709-b297d7c360e9\") " Mar 20 08:07:02 crc kubenswrapper[4749]: I0320 08:07:02.321946 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0c2a76b-a0ed-44f5-9709-b297d7c360e9-utilities" (OuterVolumeSpecName: "utilities") pod "d0c2a76b-a0ed-44f5-9709-b297d7c360e9" (UID: "d0c2a76b-a0ed-44f5-9709-b297d7c360e9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:07:02 crc kubenswrapper[4749]: I0320 08:07:02.328468 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0c2a76b-a0ed-44f5-9709-b297d7c360e9-kube-api-access-2cxjp" (OuterVolumeSpecName: "kube-api-access-2cxjp") pod "d0c2a76b-a0ed-44f5-9709-b297d7c360e9" (UID: "d0c2a76b-a0ed-44f5-9709-b297d7c360e9"). InnerVolumeSpecName "kube-api-access-2cxjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:07:02 crc kubenswrapper[4749]: I0320 08:07:02.372779 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0c2a76b-a0ed-44f5-9709-b297d7c360e9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d0c2a76b-a0ed-44f5-9709-b297d7c360e9" (UID: "d0c2a76b-a0ed-44f5-9709-b297d7c360e9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:07:02 crc kubenswrapper[4749]: I0320 08:07:02.423150 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cxjp\" (UniqueName: \"kubernetes.io/projected/d0c2a76b-a0ed-44f5-9709-b297d7c360e9-kube-api-access-2cxjp\") on node \"crc\" DevicePath \"\"" Mar 20 08:07:02 crc kubenswrapper[4749]: I0320 08:07:02.423223 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0c2a76b-a0ed-44f5-9709-b297d7c360e9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 08:07:02 crc kubenswrapper[4749]: I0320 08:07:02.423249 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0c2a76b-a0ed-44f5-9709-b297d7c360e9-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 08:07:02 crc kubenswrapper[4749]: I0320 08:07:02.708929 4749 generic.go:334] "Generic (PLEG): container finished" podID="d0c2a76b-a0ed-44f5-9709-b297d7c360e9" containerID="dc4a80cda36af7aed56badd7f36024cbaf05d8f932cfa7857fd1ced9c6a8a059" exitCode=0 Mar 20 08:07:02 crc kubenswrapper[4749]: I0320 08:07:02.709001 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sg7kr" event={"ID":"d0c2a76b-a0ed-44f5-9709-b297d7c360e9","Type":"ContainerDied","Data":"dc4a80cda36af7aed56badd7f36024cbaf05d8f932cfa7857fd1ced9c6a8a059"} Mar 20 08:07:02 crc kubenswrapper[4749]: I0320 08:07:02.709035 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sg7kr" Mar 20 08:07:02 crc kubenswrapper[4749]: I0320 08:07:02.709071 4749 scope.go:117] "RemoveContainer" containerID="dc4a80cda36af7aed56badd7f36024cbaf05d8f932cfa7857fd1ced9c6a8a059" Mar 20 08:07:02 crc kubenswrapper[4749]: I0320 08:07:02.709052 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sg7kr" event={"ID":"d0c2a76b-a0ed-44f5-9709-b297d7c360e9","Type":"ContainerDied","Data":"de36f73a220f1670ff33f31c6644588dbebcc02ca0645655d4ccc61ee8af046b"} Mar 20 08:07:02 crc kubenswrapper[4749]: I0320 08:07:02.740165 4749 scope.go:117] "RemoveContainer" containerID="ae0db7258e1d55666458b96d5c114ea1b133ad63dfc3fc86f5d7a75ef51cdbe5" Mar 20 08:07:02 crc kubenswrapper[4749]: I0320 08:07:02.774699 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sg7kr"] Mar 20 08:07:02 crc kubenswrapper[4749]: I0320 08:07:02.786871 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sg7kr"] Mar 20 08:07:02 crc kubenswrapper[4749]: I0320 08:07:02.793102 4749 scope.go:117] "RemoveContainer" containerID="4b1ebac3ab8b3aea1343ba749e04db87aab683eac9cdf078ddfe75ee95522868" Mar 20 08:07:02 crc kubenswrapper[4749]: I0320 08:07:02.832534 4749 scope.go:117] "RemoveContainer" containerID="dc4a80cda36af7aed56badd7f36024cbaf05d8f932cfa7857fd1ced9c6a8a059" Mar 20 08:07:02 crc kubenswrapper[4749]: E0320 08:07:02.833308 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc4a80cda36af7aed56badd7f36024cbaf05d8f932cfa7857fd1ced9c6a8a059\": container with ID starting with dc4a80cda36af7aed56badd7f36024cbaf05d8f932cfa7857fd1ced9c6a8a059 not found: ID does not exist" containerID="dc4a80cda36af7aed56badd7f36024cbaf05d8f932cfa7857fd1ced9c6a8a059" Mar 20 08:07:02 crc kubenswrapper[4749]: I0320 08:07:02.833400 4749 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc4a80cda36af7aed56badd7f36024cbaf05d8f932cfa7857fd1ced9c6a8a059"} err="failed to get container status \"dc4a80cda36af7aed56badd7f36024cbaf05d8f932cfa7857fd1ced9c6a8a059\": rpc error: code = NotFound desc = could not find container \"dc4a80cda36af7aed56badd7f36024cbaf05d8f932cfa7857fd1ced9c6a8a059\": container with ID starting with dc4a80cda36af7aed56badd7f36024cbaf05d8f932cfa7857fd1ced9c6a8a059 not found: ID does not exist" Mar 20 08:07:02 crc kubenswrapper[4749]: I0320 08:07:02.833470 4749 scope.go:117] "RemoveContainer" containerID="ae0db7258e1d55666458b96d5c114ea1b133ad63dfc3fc86f5d7a75ef51cdbe5" Mar 20 08:07:02 crc kubenswrapper[4749]: E0320 08:07:02.834144 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae0db7258e1d55666458b96d5c114ea1b133ad63dfc3fc86f5d7a75ef51cdbe5\": container with ID starting with ae0db7258e1d55666458b96d5c114ea1b133ad63dfc3fc86f5d7a75ef51cdbe5 not found: ID does not exist" containerID="ae0db7258e1d55666458b96d5c114ea1b133ad63dfc3fc86f5d7a75ef51cdbe5" Mar 20 08:07:02 crc kubenswrapper[4749]: I0320 08:07:02.834219 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae0db7258e1d55666458b96d5c114ea1b133ad63dfc3fc86f5d7a75ef51cdbe5"} err="failed to get container status \"ae0db7258e1d55666458b96d5c114ea1b133ad63dfc3fc86f5d7a75ef51cdbe5\": rpc error: code = NotFound desc = could not find container \"ae0db7258e1d55666458b96d5c114ea1b133ad63dfc3fc86f5d7a75ef51cdbe5\": container with ID starting with ae0db7258e1d55666458b96d5c114ea1b133ad63dfc3fc86f5d7a75ef51cdbe5 not found: ID does not exist" Mar 20 08:07:02 crc kubenswrapper[4749]: I0320 08:07:02.834270 4749 scope.go:117] "RemoveContainer" containerID="4b1ebac3ab8b3aea1343ba749e04db87aab683eac9cdf078ddfe75ee95522868" Mar 20 08:07:02 crc kubenswrapper[4749]: E0320 08:07:02.834819 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b1ebac3ab8b3aea1343ba749e04db87aab683eac9cdf078ddfe75ee95522868\": container with ID starting with 4b1ebac3ab8b3aea1343ba749e04db87aab683eac9cdf078ddfe75ee95522868 not found: ID does not exist" containerID="4b1ebac3ab8b3aea1343ba749e04db87aab683eac9cdf078ddfe75ee95522868" Mar 20 08:07:02 crc kubenswrapper[4749]: I0320 08:07:02.834888 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b1ebac3ab8b3aea1343ba749e04db87aab683eac9cdf078ddfe75ee95522868"} err="failed to get container status \"4b1ebac3ab8b3aea1343ba749e04db87aab683eac9cdf078ddfe75ee95522868\": rpc error: code = NotFound desc = could not find container \"4b1ebac3ab8b3aea1343ba749e04db87aab683eac9cdf078ddfe75ee95522868\": container with ID starting with 4b1ebac3ab8b3aea1343ba749e04db87aab683eac9cdf078ddfe75ee95522868 not found: ID does not exist" Mar 20 08:07:04 crc kubenswrapper[4749]: I0320 08:07:04.193223 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0c2a76b-a0ed-44f5-9709-b297d7c360e9" path="/var/lib/kubelet/pods/d0c2a76b-a0ed-44f5-9709-b297d7c360e9/volumes" Mar 20 08:07:06 crc kubenswrapper[4749]: I0320 08:07:06.177009 4749 scope.go:117] "RemoveContainer" containerID="9046ebdbe872cb7e98b516adea42a9b9e20a0a9303575f764937ff5f20a07ca8" Mar 20 08:07:06 crc kubenswrapper[4749]: E0320 08:07:06.177505 4749 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 08:07:08 crc kubenswrapper[4749]: I0320 08:07:08.177568 4749 scope.go:117] "RemoveContainer" containerID="621a816df4454c8ecf233dee141dafaa70aaf28eb681fbb44d2aa60ac4efe015" Mar 20 08:07:08 crc kubenswrapper[4749]: E0320 08:07:08.178096 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 08:07:10 crc kubenswrapper[4749]: I0320 08:07:10.747893 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-92ftr" Mar 20 08:07:10 crc kubenswrapper[4749]: I0320 08:07:10.820400 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-92ftr" Mar 20 08:07:11 crc kubenswrapper[4749]: I0320 08:07:11.320533 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-92ftr"] Mar 20 08:07:11 crc kubenswrapper[4749]: I0320 08:07:11.808079 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-92ftr" podUID="16f9d5d7-ffac-44f1-9092-22181265525d" containerName="registry-server" containerID="cri-o://dab0ca054b463fcde288401c8cc134782c8ad6dba39d9045800988e91039dddf" gracePeriod=2 Mar 20 08:07:12 crc kubenswrapper[4749]: I0320 08:07:12.278640 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-92ftr" Mar 20 08:07:12 crc kubenswrapper[4749]: I0320 08:07:12.399074 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16f9d5d7-ffac-44f1-9092-22181265525d-utilities\") pod \"16f9d5d7-ffac-44f1-9092-22181265525d\" (UID: \"16f9d5d7-ffac-44f1-9092-22181265525d\") " Mar 20 08:07:12 crc kubenswrapper[4749]: I0320 08:07:12.399157 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbwhs\" (UniqueName: \"kubernetes.io/projected/16f9d5d7-ffac-44f1-9092-22181265525d-kube-api-access-qbwhs\") pod \"16f9d5d7-ffac-44f1-9092-22181265525d\" (UID: \"16f9d5d7-ffac-44f1-9092-22181265525d\") " Mar 20 08:07:12 crc kubenswrapper[4749]: I0320 08:07:12.399225 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16f9d5d7-ffac-44f1-9092-22181265525d-catalog-content\") pod \"16f9d5d7-ffac-44f1-9092-22181265525d\" (UID: \"16f9d5d7-ffac-44f1-9092-22181265525d\") " Mar 20 08:07:12 crc kubenswrapper[4749]: I0320 08:07:12.400264 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16f9d5d7-ffac-44f1-9092-22181265525d-utilities" (OuterVolumeSpecName: "utilities") pod "16f9d5d7-ffac-44f1-9092-22181265525d" (UID: "16f9d5d7-ffac-44f1-9092-22181265525d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:07:12 crc kubenswrapper[4749]: I0320 08:07:12.406550 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16f9d5d7-ffac-44f1-9092-22181265525d-kube-api-access-qbwhs" (OuterVolumeSpecName: "kube-api-access-qbwhs") pod "16f9d5d7-ffac-44f1-9092-22181265525d" (UID: "16f9d5d7-ffac-44f1-9092-22181265525d"). InnerVolumeSpecName "kube-api-access-qbwhs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:07:12 crc kubenswrapper[4749]: I0320 08:07:12.501902 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16f9d5d7-ffac-44f1-9092-22181265525d-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 08:07:12 crc kubenswrapper[4749]: I0320 08:07:12.501948 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbwhs\" (UniqueName: \"kubernetes.io/projected/16f9d5d7-ffac-44f1-9092-22181265525d-kube-api-access-qbwhs\") on node \"crc\" DevicePath \"\"" Mar 20 08:07:12 crc kubenswrapper[4749]: I0320 08:07:12.558714 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16f9d5d7-ffac-44f1-9092-22181265525d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "16f9d5d7-ffac-44f1-9092-22181265525d" (UID: "16f9d5d7-ffac-44f1-9092-22181265525d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:07:12 crc kubenswrapper[4749]: I0320 08:07:12.603754 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16f9d5d7-ffac-44f1-9092-22181265525d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 08:07:12 crc kubenswrapper[4749]: I0320 08:07:12.819021 4749 generic.go:334] "Generic (PLEG): container finished" podID="16f9d5d7-ffac-44f1-9092-22181265525d" containerID="dab0ca054b463fcde288401c8cc134782c8ad6dba39d9045800988e91039dddf" exitCode=0 Mar 20 08:07:12 crc kubenswrapper[4749]: I0320 08:07:12.819085 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-92ftr" event={"ID":"16f9d5d7-ffac-44f1-9092-22181265525d","Type":"ContainerDied","Data":"dab0ca054b463fcde288401c8cc134782c8ad6dba39d9045800988e91039dddf"} Mar 20 08:07:12 crc kubenswrapper[4749]: I0320 08:07:12.819103 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-92ftr" Mar 20 08:07:12 crc kubenswrapper[4749]: I0320 08:07:12.819123 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-92ftr" event={"ID":"16f9d5d7-ffac-44f1-9092-22181265525d","Type":"ContainerDied","Data":"149d82f61fbd5f1a06f7d6d39459346c27603013c2e1fba85d0157a4427dbfe6"} Mar 20 08:07:12 crc kubenswrapper[4749]: I0320 08:07:12.819150 4749 scope.go:117] "RemoveContainer" containerID="dab0ca054b463fcde288401c8cc134782c8ad6dba39d9045800988e91039dddf" Mar 20 08:07:12 crc kubenswrapper[4749]: I0320 08:07:12.854143 4749 scope.go:117] "RemoveContainer" containerID="8abdf51f99f83da29853da59fd1417e5c55543d6de3712ea6f7cc3a888a28328" Mar 20 08:07:12 crc kubenswrapper[4749]: I0320 08:07:12.858573 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-92ftr"] Mar 20 08:07:12 crc kubenswrapper[4749]: I0320 08:07:12.864532 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-92ftr"] Mar 20 08:07:12 crc kubenswrapper[4749]: I0320 08:07:12.891214 4749 scope.go:117] "RemoveContainer" containerID="ca3f8c880cc122833df09a622f0c0654ee5d63dcc95cb743ee8e5692be0f88a6" Mar 20 08:07:12 crc kubenswrapper[4749]: I0320 08:07:12.908639 4749 scope.go:117] "RemoveContainer" containerID="dab0ca054b463fcde288401c8cc134782c8ad6dba39d9045800988e91039dddf" Mar 20 08:07:12 crc kubenswrapper[4749]: E0320 08:07:12.909046 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dab0ca054b463fcde288401c8cc134782c8ad6dba39d9045800988e91039dddf\": container with ID starting with dab0ca054b463fcde288401c8cc134782c8ad6dba39d9045800988e91039dddf not found: ID does not exist" containerID="dab0ca054b463fcde288401c8cc134782c8ad6dba39d9045800988e91039dddf" Mar 20 08:07:12 crc kubenswrapper[4749]: I0320 08:07:12.909197 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dab0ca054b463fcde288401c8cc134782c8ad6dba39d9045800988e91039dddf"} err="failed to get container status \"dab0ca054b463fcde288401c8cc134782c8ad6dba39d9045800988e91039dddf\": rpc error: code = NotFound desc = could not find container \"dab0ca054b463fcde288401c8cc134782c8ad6dba39d9045800988e91039dddf\": container with ID starting with dab0ca054b463fcde288401c8cc134782c8ad6dba39d9045800988e91039dddf not found: ID does not exist" Mar 20 08:07:12 crc kubenswrapper[4749]: I0320 08:07:12.909227 4749 scope.go:117] "RemoveContainer" containerID="8abdf51f99f83da29853da59fd1417e5c55543d6de3712ea6f7cc3a888a28328" Mar 20 08:07:12 crc kubenswrapper[4749]: E0320 08:07:12.909709 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8abdf51f99f83da29853da59fd1417e5c55543d6de3712ea6f7cc3a888a28328\": container with ID starting with 8abdf51f99f83da29853da59fd1417e5c55543d6de3712ea6f7cc3a888a28328 not found: ID does not exist" containerID="8abdf51f99f83da29853da59fd1417e5c55543d6de3712ea6f7cc3a888a28328" Mar 20 08:07:12 crc kubenswrapper[4749]: I0320 08:07:12.909760 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8abdf51f99f83da29853da59fd1417e5c55543d6de3712ea6f7cc3a888a28328"} err="failed to get container status \"8abdf51f99f83da29853da59fd1417e5c55543d6de3712ea6f7cc3a888a28328\": rpc error: code = NotFound desc = could not find container 
\"8abdf51f99f83da29853da59fd1417e5c55543d6de3712ea6f7cc3a888a28328\": container with ID starting with 8abdf51f99f83da29853da59fd1417e5c55543d6de3712ea6f7cc3a888a28328 not found: ID does not exist" Mar 20 08:07:12 crc kubenswrapper[4749]: I0320 08:07:12.909789 4749 scope.go:117] "RemoveContainer" containerID="ca3f8c880cc122833df09a622f0c0654ee5d63dcc95cb743ee8e5692be0f88a6" Mar 20 08:07:12 crc kubenswrapper[4749]: E0320 08:07:12.910104 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca3f8c880cc122833df09a622f0c0654ee5d63dcc95cb743ee8e5692be0f88a6\": container with ID starting with ca3f8c880cc122833df09a622f0c0654ee5d63dcc95cb743ee8e5692be0f88a6 not found: ID does not exist" containerID="ca3f8c880cc122833df09a622f0c0654ee5d63dcc95cb743ee8e5692be0f88a6" Mar 20 08:07:12 crc kubenswrapper[4749]: I0320 08:07:12.910132 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca3f8c880cc122833df09a622f0c0654ee5d63dcc95cb743ee8e5692be0f88a6"} err="failed to get container status \"ca3f8c880cc122833df09a622f0c0654ee5d63dcc95cb743ee8e5692be0f88a6\": rpc error: code = NotFound desc = could not find container \"ca3f8c880cc122833df09a622f0c0654ee5d63dcc95cb743ee8e5692be0f88a6\": container with ID starting with ca3f8c880cc122833df09a622f0c0654ee5d63dcc95cb743ee8e5692be0f88a6 not found: ID does not exist" Mar 20 08:07:14 crc kubenswrapper[4749]: I0320 08:07:14.196902 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16f9d5d7-ffac-44f1-9092-22181265525d" path="/var/lib/kubelet/pods/16f9d5d7-ffac-44f1-9092-22181265525d/volumes" Mar 20 08:07:19 crc kubenswrapper[4749]: I0320 08:07:19.178160 4749 scope.go:117] "RemoveContainer" containerID="9046ebdbe872cb7e98b516adea42a9b9e20a0a9303575f764937ff5f20a07ca8" Mar 20 08:07:19 crc kubenswrapper[4749]: E0320 08:07:19.179048 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 08:07:22 crc kubenswrapper[4749]: I0320 08:07:22.177075 4749 scope.go:117] "RemoveContainer" containerID="621a816df4454c8ecf233dee141dafaa70aaf28eb681fbb44d2aa60ac4efe015" Mar 20 08:07:22 crc kubenswrapper[4749]: E0320 08:07:22.178433 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 08:07:32 crc kubenswrapper[4749]: I0320 08:07:32.178332 4749 scope.go:117] "RemoveContainer" containerID="9046ebdbe872cb7e98b516adea42a9b9e20a0a9303575f764937ff5f20a07ca8" Mar 20 08:07:32 crc kubenswrapper[4749]: E0320 08:07:32.178984 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 08:07:36 crc kubenswrapper[4749]: I0320 08:07:36.178409 4749 scope.go:117] "RemoveContainer" 
containerID="621a816df4454c8ecf233dee141dafaa70aaf28eb681fbb44d2aa60ac4efe015" Mar 20 08:07:36 crc kubenswrapper[4749]: E0320 08:07:36.179081 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 08:07:47 crc kubenswrapper[4749]: I0320 08:07:47.176840 4749 scope.go:117] "RemoveContainer" containerID="621a816df4454c8ecf233dee141dafaa70aaf28eb681fbb44d2aa60ac4efe015" Mar 20 08:07:47 crc kubenswrapper[4749]: I0320 08:07:47.178669 4749 scope.go:117] "RemoveContainer" containerID="9046ebdbe872cb7e98b516adea42a9b9e20a0a9303575f764937ff5f20a07ca8" Mar 20 08:07:47 crc kubenswrapper[4749]: E0320 08:07:47.178969 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 08:07:47 crc kubenswrapper[4749]: E0320 08:07:47.178991 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 08:07:59 crc kubenswrapper[4749]: I0320 08:07:59.177616 4749 scope.go:117] "RemoveContainer" containerID="621a816df4454c8ecf233dee141dafaa70aaf28eb681fbb44d2aa60ac4efe015" Mar 20 08:07:59 crc kubenswrapper[4749]: E0320 08:07:59.178780 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 08:08:00 crc kubenswrapper[4749]: I0320 08:08:00.178252 4749 scope.go:117] "RemoveContainer" containerID="9046ebdbe872cb7e98b516adea42a9b9e20a0a9303575f764937ff5f20a07ca8" Mar 20 08:08:00 crc kubenswrapper[4749]: E0320 08:08:00.179087 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 08:08:00 crc kubenswrapper[4749]: I0320 08:08:00.189737 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566568-cr7hm"] Mar 20 08:08:00 crc kubenswrapper[4749]: E0320 08:08:00.190061 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16f9d5d7-ffac-44f1-9092-22181265525d" containerName="registry-server" Mar 20 08:08:00 crc kubenswrapper[4749]: I0320 08:08:00.190080 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="16f9d5d7-ffac-44f1-9092-22181265525d" containerName="registry-server" Mar 20 08:08:00 crc kubenswrapper[4749]: E0320 08:08:00.190089 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0c2a76b-a0ed-44f5-9709-b297d7c360e9" 
containerName="registry-server" Mar 20 08:08:00 crc kubenswrapper[4749]: I0320 08:08:00.190095 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0c2a76b-a0ed-44f5-9709-b297d7c360e9" containerName="registry-server" Mar 20 08:08:00 crc kubenswrapper[4749]: E0320 08:08:00.190114 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0c2a76b-a0ed-44f5-9709-b297d7c360e9" containerName="extract-utilities" Mar 20 08:08:00 crc kubenswrapper[4749]: I0320 08:08:00.190121 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0c2a76b-a0ed-44f5-9709-b297d7c360e9" containerName="extract-utilities" Mar 20 08:08:00 crc kubenswrapper[4749]: E0320 08:08:00.190156 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16f9d5d7-ffac-44f1-9092-22181265525d" containerName="extract-content" Mar 20 08:08:00 crc kubenswrapper[4749]: I0320 08:08:00.190162 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="16f9d5d7-ffac-44f1-9092-22181265525d" containerName="extract-content" Mar 20 08:08:00 crc kubenswrapper[4749]: E0320 08:08:00.190173 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16f9d5d7-ffac-44f1-9092-22181265525d" containerName="extract-utilities" Mar 20 08:08:00 crc kubenswrapper[4749]: I0320 08:08:00.190180 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="16f9d5d7-ffac-44f1-9092-22181265525d" containerName="extract-utilities" Mar 20 08:08:00 crc kubenswrapper[4749]: E0320 08:08:00.190192 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0c2a76b-a0ed-44f5-9709-b297d7c360e9" containerName="extract-content" Mar 20 08:08:00 crc kubenswrapper[4749]: I0320 08:08:00.190198 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0c2a76b-a0ed-44f5-9709-b297d7c360e9" containerName="extract-content" Mar 20 08:08:00 crc kubenswrapper[4749]: I0320 08:08:00.190408 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="16f9d5d7-ffac-44f1-9092-22181265525d" containerName="registry-server" Mar 20 08:08:00 crc kubenswrapper[4749]: I0320 08:08:00.190424 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0c2a76b-a0ed-44f5-9709-b297d7c360e9" containerName="registry-server" Mar 20 08:08:00 crc kubenswrapper[4749]: I0320 08:08:00.191076 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566568-cr7hm" Mar 20 08:08:00 crc kubenswrapper[4749]: I0320 08:08:00.193234 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 08:08:00 crc kubenswrapper[4749]: I0320 08:08:00.193735 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6rhdf" Mar 20 08:08:00 crc kubenswrapper[4749]: I0320 08:08:00.198969 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 08:08:00 crc kubenswrapper[4749]: I0320 08:08:00.202964 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566568-cr7hm"] Mar 20 08:08:00 crc kubenswrapper[4749]: I0320 08:08:00.351381 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlt6q\" (UniqueName: \"kubernetes.io/projected/f9e08763-f18b-4d35-9d8a-707aca231b8e-kube-api-access-zlt6q\") pod \"auto-csr-approver-29566568-cr7hm\" (UID: \"f9e08763-f18b-4d35-9d8a-707aca231b8e\") " pod="openshift-infra/auto-csr-approver-29566568-cr7hm" Mar 20 08:08:00 crc kubenswrapper[4749]: I0320 08:08:00.454854 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlt6q\" (UniqueName: \"kubernetes.io/projected/f9e08763-f18b-4d35-9d8a-707aca231b8e-kube-api-access-zlt6q\") pod \"auto-csr-approver-29566568-cr7hm\" (UID: \"f9e08763-f18b-4d35-9d8a-707aca231b8e\") " pod="openshift-infra/auto-csr-approver-29566568-cr7hm" Mar 20 08:08:00 crc kubenswrapper[4749]: I0320 08:08:00.477219 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlt6q\" (UniqueName: \"kubernetes.io/projected/f9e08763-f18b-4d35-9d8a-707aca231b8e-kube-api-access-zlt6q\") pod \"auto-csr-approver-29566568-cr7hm\" (UID: \"f9e08763-f18b-4d35-9d8a-707aca231b8e\") " pod="openshift-infra/auto-csr-approver-29566568-cr7hm" Mar 20 08:08:00 crc kubenswrapper[4749]: I0320 08:08:00.514501 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566568-cr7hm" Mar 20 08:08:00 crc kubenswrapper[4749]: I0320 08:08:00.996872 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566568-cr7hm"] Mar 20 08:08:01 crc kubenswrapper[4749]: I0320 08:08:01.017827 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 08:08:01 crc kubenswrapper[4749]: I0320 08:08:01.272230 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566568-cr7hm" event={"ID":"f9e08763-f18b-4d35-9d8a-707aca231b8e","Type":"ContainerStarted","Data":"048acc27caf0df796013b96f28a9053168d264452f5747ea608a19f9f414dc1f"} Mar 20 08:08:02 crc kubenswrapper[4749]: I0320 08:08:02.280794 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566568-cr7hm" event={"ID":"f9e08763-f18b-4d35-9d8a-707aca231b8e","Type":"ContainerStarted","Data":"1923ce68cacb2b282386270bf599c68704b84d64c7afaef7f88ad6d43ff4cd6a"} Mar 20 08:08:02 crc kubenswrapper[4749]: I0320 08:08:02.304214 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566568-cr7hm" podStartSLOduration=1.452297965 podStartE2EDuration="2.304189551s" podCreationTimestamp="2026-03-20 08:08:00 +0000 UTC" firstStartedPulling="2026-03-20 08:08:01.017457107 +0000 UTC m=+3317.567114784" lastFinishedPulling="2026-03-20 08:08:01.869348683 +0000 UTC m=+3318.419006370" observedRunningTime="2026-03-20 08:08:02.300938023 +0000 UTC m=+3318.850595710" watchObservedRunningTime="2026-03-20 08:08:02.304189551 +0000 UTC m=+3318.853847228" Mar 20 08:08:03 crc kubenswrapper[4749]: I0320 08:08:03.288450 4749 generic.go:334] "Generic (PLEG): container finished" podID="f9e08763-f18b-4d35-9d8a-707aca231b8e" containerID="1923ce68cacb2b282386270bf599c68704b84d64c7afaef7f88ad6d43ff4cd6a" exitCode=0 Mar 20 08:08:03 crc kubenswrapper[4749]: I0320 08:08:03.288494 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566568-cr7hm" event={"ID":"f9e08763-f18b-4d35-9d8a-707aca231b8e","Type":"ContainerDied","Data":"1923ce68cacb2b282386270bf599c68704b84d64c7afaef7f88ad6d43ff4cd6a"} Mar 20 08:08:04 crc kubenswrapper[4749]: I0320 08:08:04.678255 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566568-cr7hm" Mar 20 08:08:04 crc kubenswrapper[4749]: I0320 08:08:04.841406 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlt6q\" (UniqueName: \"kubernetes.io/projected/f9e08763-f18b-4d35-9d8a-707aca231b8e-kube-api-access-zlt6q\") pod \"f9e08763-f18b-4d35-9d8a-707aca231b8e\" (UID: \"f9e08763-f18b-4d35-9d8a-707aca231b8e\") " Mar 20 08:08:04 crc kubenswrapper[4749]: I0320 08:08:04.848873 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9e08763-f18b-4d35-9d8a-707aca231b8e-kube-api-access-zlt6q" (OuterVolumeSpecName: "kube-api-access-zlt6q") pod "f9e08763-f18b-4d35-9d8a-707aca231b8e" (UID: "f9e08763-f18b-4d35-9d8a-707aca231b8e"). InnerVolumeSpecName "kube-api-access-zlt6q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:08:04 crc kubenswrapper[4749]: I0320 08:08:04.942942 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlt6q\" (UniqueName: \"kubernetes.io/projected/f9e08763-f18b-4d35-9d8a-707aca231b8e-kube-api-access-zlt6q\") on node \"crc\" DevicePath \"\"" Mar 20 08:08:05 crc kubenswrapper[4749]: I0320 08:08:05.311156 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566568-cr7hm" event={"ID":"f9e08763-f18b-4d35-9d8a-707aca231b8e","Type":"ContainerDied","Data":"048acc27caf0df796013b96f28a9053168d264452f5747ea608a19f9f414dc1f"} Mar 20 08:08:05 crc kubenswrapper[4749]: I0320 08:08:05.311217 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566568-cr7hm" Mar 20 08:08:05 crc kubenswrapper[4749]: I0320 08:08:05.311233 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="048acc27caf0df796013b96f28a9053168d264452f5747ea608a19f9f414dc1f" Mar 20 08:08:05 crc kubenswrapper[4749]: I0320 08:08:05.747457 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566562-gnlpf"] Mar 20 08:08:05 crc kubenswrapper[4749]: I0320 08:08:05.753127 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566562-gnlpf"] Mar 20 08:08:06 crc kubenswrapper[4749]: I0320 08:08:06.210698 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f4d5f3a-39d8-403e-9dfc-6155204cadf0" path="/var/lib/kubelet/pods/4f4d5f3a-39d8-403e-9dfc-6155204cadf0/volumes" Mar 20 08:08:12 crc kubenswrapper[4749]: I0320 08:08:12.177616 4749 scope.go:117] "RemoveContainer" containerID="621a816df4454c8ecf233dee141dafaa70aaf28eb681fbb44d2aa60ac4efe015" Mar 20 08:08:12 crc kubenswrapper[4749]: E0320 08:08:12.178686 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 08:08:14 crc kubenswrapper[4749]: I0320 08:08:14.182458 4749 scope.go:117] "RemoveContainer" containerID="9046ebdbe872cb7e98b516adea42a9b9e20a0a9303575f764937ff5f20a07ca8" Mar 20 08:08:14 crc kubenswrapper[4749]: E0320 08:08:14.182938 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 08:08:19 crc kubenswrapper[4749]: I0320 08:08:19.567466 4749 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-9lpwm container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.71:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 08:08:19 crc kubenswrapper[4749]: I0320 08:08:19.567813 4749 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-9lpwm container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.217.0.71:8080/healthz\": context deadline exceeded (Client.Timeout exceeded 
while awaiting headers)" start-of-body= Mar 20 08:08:19 crc kubenswrapper[4749]: I0320 08:08:19.568055 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-9lpwm" podUID="ab07be1c-a7c8-4310-b2be-7dea01a4a55b" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.71:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 08:08:19 crc kubenswrapper[4749]: I0320 08:08:19.568101 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-79b997595-9lpwm" podUID="ab07be1c-a7c8-4310-b2be-7dea01a4a55b" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.71:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 08:08:27 crc kubenswrapper[4749]: I0320 08:08:27.178717 4749 scope.go:117] "RemoveContainer" containerID="621a816df4454c8ecf233dee141dafaa70aaf28eb681fbb44d2aa60ac4efe015" Mar 20 08:08:27 crc kubenswrapper[4749]: E0320 08:08:27.180110 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 08:08:28 crc kubenswrapper[4749]: I0320 08:08:28.178047 4749 scope.go:117] "RemoveContainer" containerID="9046ebdbe872cb7e98b516adea42a9b9e20a0a9303575f764937ff5f20a07ca8" Mar 20 08:08:28 crc kubenswrapper[4749]: E0320 08:08:28.179343 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 08:08:42 crc kubenswrapper[4749]: I0320 08:08:42.177595 4749 scope.go:117] "RemoveContainer" containerID="9046ebdbe872cb7e98b516adea42a9b9e20a0a9303575f764937ff5f20a07ca8" Mar 20 08:08:42 crc kubenswrapper[4749]: I0320 08:08:42.178973 4749 scope.go:117] "RemoveContainer" containerID="621a816df4454c8ecf233dee141dafaa70aaf28eb681fbb44d2aa60ac4efe015" Mar 20 08:08:42 crc kubenswrapper[4749]: E0320 08:08:42.179317 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 08:08:42 crc kubenswrapper[4749]: E0320 08:08:42.179317 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 08:08:54 crc kubenswrapper[4749]: I0320 08:08:54.181357 4749 scope.go:117] "RemoveContainer" containerID="9046ebdbe872cb7e98b516adea42a9b9e20a0a9303575f764937ff5f20a07ca8" Mar 20 08:08:54 crc kubenswrapper[4749]: E0320 08:08:54.181963 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 08:08:57 crc kubenswrapper[4749]: I0320 08:08:57.177967 4749 scope.go:117] "RemoveContainer" containerID="621a816df4454c8ecf233dee141dafaa70aaf28eb681fbb44d2aa60ac4efe015" Mar 20 08:08:57 crc kubenswrapper[4749]: E0320 08:08:57.178596 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 08:08:57 crc kubenswrapper[4749]: I0320 08:08:57.241700 4749 scope.go:117] "RemoveContainer" containerID="d91dfc559d5090b384a320f468e9284817d13acb128fdbcccfe3aee2cf549b97" Mar 20 08:09:04 crc kubenswrapper[4749]: I0320 08:09:04.514767 4749 patch_prober.go:28] interesting pod/machine-config-daemon-fxqfd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:09:04 crc kubenswrapper[4749]: I0320 08:09:04.515545 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:09:09 crc kubenswrapper[4749]: I0320 08:09:09.177527 4749 scope.go:117] "RemoveContainer" containerID="621a816df4454c8ecf233dee141dafaa70aaf28eb681fbb44d2aa60ac4efe015" Mar 20 08:09:09 crc kubenswrapper[4749]: I0320 08:09:09.178167 4749 scope.go:117] "RemoveContainer" containerID="9046ebdbe872cb7e98b516adea42a9b9e20a0a9303575f764937ff5f20a07ca8" Mar 20 08:09:09 crc kubenswrapper[4749]: E0320 08:09:09.178469 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 08:09:09 crc kubenswrapper[4749]: E0320 08:09:09.178579 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 08:09:21 crc kubenswrapper[4749]: I0320 08:09:21.177178 4749 scope.go:117] "RemoveContainer" containerID="621a816df4454c8ecf233dee141dafaa70aaf28eb681fbb44d2aa60ac4efe015" Mar 20 08:09:21 crc kubenswrapper[4749]: E0320 08:09:21.178034 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-server-0_openstack(8db06e36-0b00-4157-9345-69449da3e85f)\"" pod="openstack/rabbitmq-server-0" podUID="8db06e36-0b00-4157-9345-69449da3e85f" Mar 20 08:09:23 crc kubenswrapper[4749]: I0320 08:09:23.177566 
4749 scope.go:117] "RemoveContainer" containerID="9046ebdbe872cb7e98b516adea42a9b9e20a0a9303575f764937ff5f20a07ca8" Mar 20 08:09:24 crc kubenswrapper[4749]: I0320 08:09:24.187688 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8b9b402f-2d95-48f5-98d8-497d90956ba2","Type":"ContainerStarted","Data":"2d9a479f8ea3a32c9d996ab8749b8bc30b57575aa4ff4dd260234015036f2b9c"} Mar 20 08:09:24 crc kubenswrapper[4749]: I0320 08:09:24.189343 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:09:27 crc kubenswrapper[4749]: I0320 08:09:27.215845 4749 generic.go:334] "Generic (PLEG): container finished" podID="8b9b402f-2d95-48f5-98d8-497d90956ba2" containerID="2d9a479f8ea3a32c9d996ab8749b8bc30b57575aa4ff4dd260234015036f2b9c" exitCode=0 Mar 20 08:09:27 crc kubenswrapper[4749]: I0320 08:09:27.215958 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8b9b402f-2d95-48f5-98d8-497d90956ba2","Type":"ContainerDied","Data":"2d9a479f8ea3a32c9d996ab8749b8bc30b57575aa4ff4dd260234015036f2b9c"} Mar 20 08:09:27 crc kubenswrapper[4749]: I0320 08:09:27.216920 4749 scope.go:117] "RemoveContainer" containerID="9046ebdbe872cb7e98b516adea42a9b9e20a0a9303575f764937ff5f20a07ca8" Mar 20 08:09:27 crc kubenswrapper[4749]: I0320 08:09:27.217915 4749 scope.go:117] "RemoveContainer" containerID="2d9a479f8ea3a32c9d996ab8749b8bc30b57575aa4ff4dd260234015036f2b9c" Mar 20 08:09:27 crc kubenswrapper[4749]: E0320 08:09:27.218345 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"rabbitmq\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=rabbitmq pod=rabbitmq-cell1-server-0_openstack(8b9b402f-2d95-48f5-98d8-497d90956ba2)\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8b9b402f-2d95-48f5-98d8-497d90956ba2" Mar 20 08:09:34 crc kubenswrapper[4749]: I0320 08:09:34.190775 4749 scope.go:117] "RemoveContainer" containerID="621a816df4454c8ecf233dee141dafaa70aaf28eb681fbb44d2aa60ac4efe015" Mar 20 08:09:34 crc kubenswrapper[4749]: I0320 08:09:34.514883 4749 patch_prober.go:28] interesting pod/machine-config-daemon-fxqfd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:09:34 crc kubenswrapper[4749]: I0320 08:09:34.515264 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxqfd" podUID="12151228-1cb9-4086-9a62-f4a9583f5f69" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:09:35 crc kubenswrapper[4749]: I0320 08:09:35.294004 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8db06e36-0b00-4157-9345-69449da3e85f","Type":"ContainerStarted","Data":"6e70d8b6ab92c09177d4d25aeadad1a5a7e4a558674173c891ff4e2c0fea464e"} Mar 20 08:09:35 crc kubenswrapper[4749]: I0320 08:09:35.295210 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" var/home/core/zuul-output/logs/crc-cloud-workdir-crc-all-logs.tar.gz0000644000175000000000000000005515157200313024442 0ustar coreroot  
var/home/core/zuul-output/logs/crc-cloud/
var/home/core/zuul-output/artifacts/
var/home/core/zuul-output/docs/